```javascript
const visitedLinks = new Set(), visitedIds = new Set(); // Just in case somebody gets tricky and adds unique links that have duplicate IDs

async function visitLink(link) {
  let total = 0;
  if (!visitedLinks.has(link)) {
    visitedLinks.add(link);
    const { id, content, links } = await (await fetch(link)).json();
    if (!visitedIds.has(id)) {
      visitedIds.add(id);
      total += parseFloat(content.match(/\$[\d,]+(?:[,.]\d+)?/)[0].substring(1).replace(',', '.'));
      for (const link of links) {
        total += await visitLink(link);
      }
    }
  }
  return total;
}

visitLink('https://gist.githubusercontent.com/jorinvo/6f68380dd07e5db3cf5fd48b2465bb04/raw/c02b1e0b45ecb2e54b36e4410d0631a66d474323/fd0d929f-966f-4d1a-89cd-feee5a1c5347.json').then(console.log);
// 9064.78999999999 (JS rounding errors, yay)
```
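The trailing 9064.78999999999 is ordinary IEEE 754 binary floating point, not a JavaScript bug. A minimal sketch of why it happens and one common workaround, summing in integer cents (`addCents` is a hypothetical helper, not from the solution above):

```javascript
// Binary floating point cannot represent most decimal fractions exactly,
// so repeatedly adding dollar amounts accumulates tiny errors:
let floatTotal = 0;
for (let i = 0; i < 10; i++) floatTotal += 0.1;
console.log(floatTotal); // 0.9999999999999999, not 1

// Working in integer cents keeps every intermediate sum exact:
function addCents(amounts) {
  // amounts like '1699.15' -> 169915 cents
  const cents = amounts.reduce(
    (sum, a) => sum + Math.round(parseFloat(a) * 100),
    0
  );
  return cents / 100;
}
console.log(addCents(['1699.15', '0.1', '0.2'])); // 1699.45
```

This is the same concern the Ruby solution elsewhere in the thread addresses with BigDecimal; in JavaScript, integer cents or a decimal library is the usual answer.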
I'm a Sr. Software Engineer at Flashpoint. I specialize in Python and Go, building functional, practical, and maintainable web systems leveraging Kubernetes and the cloud. Blog opinions are my own.
```ruby
require 'bigdecimal' # Use decimals when money is involved
require 'json'
require 'net/http'
require 'set'

# Hunts down linked transactions without double-counting them
class TransactionFinder
  attr_reader :found

  def initialize(first_url)
    @pending = [first_url]
    @found = {
      # 'fd0d929f'... : '$1699,15'
    }
    @hit_uris = Set.new
  end

  def hunt
    until @pending.empty?
      current = @pending.pop
      next if @hit_uris.include?(current)
      @hit_uris.add(current)
      result = get(current)
      @found[result['id']] = result['content'].scan(/\$[0-9,.]+/).first
      @pending.concat(result['links'])
    end
  end

  def get(uri)
    result_string = Net::HTTP.get_response(URI(uri))
    JSON.parse(result_string.body)
  end

  def write_transactions(filename)
    File.open(filename, 'w') { |f| f.write(JSON.pretty_generate(@found)) }
  end

  def total
    @found.values.reduce(BigDecimal('0')) do |sum, current|
      dollars, cents = current.scan(/[0-9]+/)
      sum + BigDecimal("#{dollars}.#{cents}")
    end
  end
end

first_uri = 'https://gist.githubusercontent.com/jorinvo/6f68380dd07e5db3cf5fd48b2465bb04/raw/c02b1e0b45ecb2e54b36e4410d0631a66d474323/fd0d929f-966f-4d1a-89cd-feee5a1c5347.json'
t = TransactionFinder.new(first_uri)
t.hunt
t.write_transactions('data.json')
puts "$#{t.total.to_f}"
```
Nice! I wrote a similar version but using a sync.WaitGroup and a separate constant number of workers to parallelize the download. You can find it here.
One possible further optimization is to let different "layers" of JSON object URLs run concurrently. I haven't come up with an actual implementation yet, though (as you can see, right now my implementation only crawls the gist URLs one "layer" at a time).
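The layer-by-layer idea above can be sketched as a breadth-first crawl where every URL in the current layer is fetched concurrently. This is an illustrative sketch in JavaScript rather than Go, with `fetchJson` as an injected dependency (assumed to resolve a URL to a `{id, content, links}` object — the names are not from the posted solutions):

```javascript
// Crawl breadth-first: fetch the whole current layer with Promise.all,
// then collect the next layer from the links of every result.
async function crawlByLayer(startUrl, fetchJson) {
  const seenUrls = new Set();
  const seenIds = new Set(); // still needed: distinct URLs may share an ID
  let total = 0;
  let layer = [startUrl];
  while (layer.length > 0) {
    const fresh = layer.filter((url) => !seenUrls.has(url));
    fresh.forEach((url) => seenUrls.add(url));
    // All requests in this layer run concurrently.
    const results = await Promise.all(fresh.map(fetchJson));
    layer = [];
    for (const { id, content, links } of results) {
      if (seenIds.has(id)) continue;
      seenIds.add(id);
      const match = content.match(/\$\d+(?:[.,]\d+)?/);
      if (match) total += parseFloat(match[0].slice(1).replace(',', '.'));
      layer.push(...links);
    }
  }
  return total;
}
```

Compared to a fixed worker pool, this needs no worker bookkeeping, but its concurrency is capped by the width of each layer rather than a constant.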
Loving it!
My take:
In JavaScript:
I love whenever anybody posts challenges like this! Also, having the decimal separator be both . and , was tricksy.

SPOILER: $9064.79
My solution using goroutines for speed :)
P.S. Sorry for the ugly code :D It was written in a hurry.
My quick PHP solution
Here's a Scala version, asynchronous, concurrent, non-blocking with async/await.
Using dispatch for requests and circe for JSON decoding.
Hi! I think I solved it :) and it was great fun!
End of track: $146.091,89
The code is in my repo:
github.com/DiegoMGar/LearningChall...
Really had tons of fun solving it! Thanks for the learning opportunity :D
Interesting! I'm doing it :)
TY
You can have a look at some existing solutions over here.