Tuesday, 28 November 2006
Tom White and Doug Cutting are happily hacking on two Hadoop issues (HADOOP-574 and HADOOP-571) that will allow Hadoop's distributed filesystem to be initialized from Amazon S3 and to commit results back to S3. Combine that with Hadoop's Map/Reduce implementation, run the whole Hadoop package (HDFS and MapReduce) on an Amazon EC2 cluster (see the Amazon EC2 page on the Hadoop wiki), and you've got your very own cluster computing engine up and running.
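The setup described above would boil down to pointing Hadoop's configuration at an S3 bucket instead of a local HDFS namenode. A rough sketch of what such a configuration might look like (the bucket name and the credential placeholders are made up for illustration):

```xml
<!-- hadoop-site.xml: use an S3 bucket as the default filesystem -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- "mybucket" is a placeholder bucket name -->
    <value>s3://mybucket</value>
  </property>
  <property>
    <name>fs.s3.awsAccessKeyId</name>
    <value>YOUR_AWS_ACCESS_KEY_ID</value>
  </property>
  <property>
    <name>fs.s3.awsSecretAccessKey</name>
    <value>YOUR_AWS_SECRET_ACCESS_KEY</value>
  </property>
</configuration>
```

With that in place, MapReduce jobs would read their input from and write their output to S3, while the computation itself runs on the EC2 nodes.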