Dec 8, 2014
Tonight on Twitter, @jessitron posted the following:
The other week, I needed to do exactly this, and came up with a slightly different approach. Here’s my (gently elaborated) version of Jessica’s snippet:
```ruby
%w(a b a).reduce(Hash.new) do |map, item|
  map.merge(item => 1) do |key, sum, increment|
    sum + increment
  end
end
```
I don’t know that it’s better or worse, but it has two properties I like:
- `merge`’s ability to take a block to resolve conflicting updates.[1]
The last point is the linchpin of the block.
When we merge a new `item` key into `map`, its value is set to 1. If we encounter that key again, `merge` sets the value of the key to the result of its block.
The block itself takes three arguments: the conflicting key, the existing value of that key, and the value we attempted to merge.
For our purposes, the key isn’t necessary; but notice that its existing value is simply the number of `item` keys we’ve tried to merge so far (starting with 1), and the value we’re attempting to merge is the value by which we increment the count.
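As a standalone illustration of that conflict-resolution block (the variable names here are mine, not from the snippet):

```ruby
counts = { "a" => 2 }

# When both hashes contain "a", the block receives the key, the
# existing value, and the incoming value, and returns the merged value.
merged = counts.merge("a" => 1) do |key, existing, incoming|
  existing + incoming
end

merged  # => { "a" => 3 }
counts  # => { "a" => 2 } -- merge leaves the receiver untouched
```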
Anyway, there’s not really a point to this post, other than that it can be fun to fart around with Enumerable on a sleety Monday night.
These days, I’d almost certainly do what Jessica wrote in her initial toot, with one small change:
```ruby
%w(a b a).each_with_object(Hash.new(0)) do |i, h|
  h[i] += 1
end
# => { "a" => 2, "b" => 1 }
```
Why is this? Well: these days, I actually understand what `Hash.new(0)` does. In 2014, that was not the case.
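A side note on `Hash.new(0)`: it gives the hash a default value of 0 for missing keys, which is exactly what lets `h[i] += 1` work the first time a key appears. A quick sketch:

```ruby
plain   = Hash.new     # missing keys default to nil
counted = Hash.new(0)  # missing keys default to 0

plain["a"]    # => nil; `plain["a"] += 1` would raise NoMethodError
counted["a"]  # => 0
counted["a"] += 1      # 0 + 1
counted       # => { "a" => 1 }
```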
[1] My friend Ransom pointed out that, since `merge` copies the source hash, the code I’ve written has quadratic complexity (traversal × merge). There’s no practical downside to changing it to `merge!`, which would avoid the copy.
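For completeness, here’s a sketch of that suggested change, swapping `merge` for `merge!` in the original snippet:

```ruby
# merge! resolves conflicts with the same block semantics as merge,
# but mutates map in place instead of copying it on every pass.
# It returns the receiver, so reduce threads the hash through correctly.
result = %w(a b a).reduce(Hash.new) do |map, item|
  map.merge!(item => 1) do |key, sum, increment|
    sum + increment
  end
end

result  # => { "a" => 2, "b" => 1 }
```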