News

Google's "distributed computing for dummies" trains ResNet-50 in under half an hour. Google's new "TF-Replicator" technology is meant to be drop-dead simple distributed computing for AI researchers.
Anyscale, a startup founded by a team out of UC Berkeley that created the Ray open-source Python framework for running distributed computing projects, has raised $40 million.
Nvidia wants to extend the success of the GPU beyond graphics and deep learning to the full data science workflow. The open-source Python library Dask is the key to this.
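For context, a minimal sketch of what Dask provides: it splits a large array or dataframe into chunks, records a task graph, and schedules the pieces across cores or cluster workers, so NumPy-style code scales out. The example below uses plain Dask on CPU; the GPU-accelerated path the article refers to builds on Nvidia's RAPIDS libraries, which are not shown here.

    import dask.array as da

    # Build a 20,000 x 20,000 array split into 1,000 x 1,000 chunks.
    # Nothing is computed yet; Dask only records a task graph.
    x = da.random.random((20_000, 20_000), chunks=(1_000, 1_000))

    # Familiar NumPy-style expression over the chunked array.
    result = (x + x.T).mean(axis=0)

    # .compute() executes the task graph in parallel across cores
    # (or across a cluster when a distributed scheduler is attached).
    print(result.compute()[:5])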
Anyscale, the startup behind the open source project Ray, today closed a $40 million round to support its first commercial offering, a managed Ray platform.
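As a rough illustration of what the open-source Ray project itself does (a sketch of its core task API, not of Anyscale's managed platform): functions decorated with @ray.remote become tasks that Ray schedules across the cores of one machine or the nodes of a cluster, returning futures that are resolved with ray.get.

    import ray

    ray.init()  # start a local Ray runtime; on a cluster this connects to it

    @ray.remote
    def square(x):
        # Runs as an asynchronous task on any available worker.
        return x * x

    # Launch 8 tasks in parallel; each call returns a future (ObjectRef).
    futures = [square.remote(i) for i in range(8)]

    # Block until all tasks finish and collect their results.
    print(ray.get(futures))  # [0, 1, 4, 9, 16, 25, 36, 49]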
Is distributed computing dying, or just fading into the background? There seems to be much less excitement about distributed computing these days.
In this video, Jan Meinke and Olav Zimmermann from the Jülich Supercomputing Centre present "High-Performance Computing with Python: Reducing Bottlenecks." This course addresses scientists with a ...