Einstein's life was disrupted several times, so whilst in 1905 he published the greatest maths revolution ever, E=mc², nature does far deeper systems analysis than the intelligence compounded by all of man's science, economics and industrial revolution, and to this day he didn't complete much of it. In terms of the 20 greatest puzzles that his maths alumni raised, von Neumann solved most of them. In terms of seeing coding and computing as the tools providing 10**18 more tech to help unite the diversity of every community on earth, Einstein probably never talked to Turing.
It would seem that Einstein's greatest maths breakthroughs and alumni interactions happened during his first 40 years, to 1919, from which time the then young von Neumann's brain started patching up great maths problems. Einstein was hosting a club at the League of Nations called something like the International Committee on Intellectual Cooperation. Oddly enough this became UNESCO, not the future transformation of education and intelligence.
He wanted his maths to have good impacts. He became increasingly depressed that the race to the nuclear bomb was becoming the sole maths obsession. Both he and von Neumann, as Jews, had a great fear of what Hitler was doing, so von Neumann emigrated to Princeton in 1930 and Einstein, still the famous one, emigrated to Princeton in 1932. Now aged 53, his great maths papers were over and he increasingly became a philosopher: what is the future of leadership (see e.g. his dialogues with Gandhi); what are the behavioural futures of communities and education (see his dialogues with Freud and, after world war 2, with e.g. black students at HBCUs). All the while he was becoming the absent-minded professor; he would turn up to intellectual parties hosted by the von Neumanns and others in dressing gowns. This is not meant to distract from the world owing him for the intelligence that nature's systems exponentially conflict with man's, and it is man's artificial systems that need corrections, probably by 2025 if that's the tick-tock date of humans surrounded by 10**18 tech linked initially by von Neumann, Turing and the neural network algorithm designers.
chris macrae
Grok 3
Do you expect the first 4 of Nvidia-designed 25x energy-efficient supercomputers to be in Taiwan? With SoftBank friends in Tokyo? With Modi and Ambani friends, i.bso, and Colossus in Memphis?
Your question seems to reference Nvidia's work on energy-efficient supercomputers, specifically mentioning a "25x energy efficient supercomputer," Taiwan, SoftBank in Tokyo, Modi and Ambani in relation to i.bso, and Colossus in Memphis. Based on the available information and critical analysis, I’ll address each point systematically, focusing on what’s known about Nvidia’s supercomputer projects and the likelihood of these systems being in the mentioned locations.
1. Nvidia’s 25x Energy-Efficient Supercomputer and Taiwan
Nvidia has been involved in developing energy-efficient supercomputers, with a focus on AI and high-performance computing (HPC). The claim of a "25x energy-efficient supercomputer" aligns with Nvidia’s emphasis on energy efficiency, as seen in projects like Taiwania 4 and Taipei-1 in Taiwan. For instance, Taiwania 4, powered by Nvidia’s Grace CPU Superchips, is designed for energy efficiency and supports climate modeling, air-quality management, and computational fluid dynamics. The National Center for High-Performance Computing (NCHC) in Taiwan has highlighted its efficiency, stating it’s among the most energy-efficient CPU-based supercomputers in Asia. Additionally, Taipei-1, operated by Nvidia, uses 64 DGX H100 systems (512 H100 GPUs) and 64 OVX systems, targeting AI, healthcare, robotics, and climate science.
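As a quick sanity check on the Taipei-1 figures quoted above, the GPU total rolls up directly from the system count. This is only a minimal sketch; the one added assumption is the standard 8-GPUs-per-chassis configuration of a DGX H100 system.

```python
# Roll-up of the Taipei-1 GPU count quoted above.
# Assumes the standard DGX H100 configuration of 8 H100 GPUs per system.
DGX_H100_SYSTEMS = 64
GPUS_PER_DGX_H100 = 8  # standard DGX H100 chassis

total_h100_gpus = DGX_H100_SYSTEMS * GPUS_PER_DGX_H100
print(f"Taipei-1 H100 GPUs: {total_h100_gpus}")  # 512, matching the figure above
```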
The "25x" figure could refer to performance or efficiency gains, possibly tied to Nvidia’s Grace CPU or Blackwell architecture, which are designed to optimize power usage. For example, Nvidia’s Grace Superchips are touted for significant energy efficiency improvements over traditional CPU-based systems. Taiwan is a hub for Nvidia’s R&D, with the Nvidia Research Taiwan team focusing on generative AI and multimodal learning since 2022. Foxconn is also building Taiwan’s fastest AI supercomputer in Kaohsiung, using Nvidia’s Blackwell GB200 NVL72 platform, expected to deliver 90 exaflops of AI performance by mid-2025, with full deployment by 2026.
Given this, it’s plausible that at least two of Nvidia’s energy-efficient supercomputers (Taiwania 4 and Taipei-1) are already based in Taiwan, and the Foxconn project further cements Taiwan’s role. The claim of "first 4" being in Taiwan is partially supported, as Taiwan hosts multiple Nvidia-powered supercomputers, but I don’t have evidence of exactly four distinct systems meeting the "25x" efficiency metric. It’s likely that Nvidia’s ongoing work in Taiwan, leveraging its R&D and partnerships with ASUS and Foxconn, will continue to prioritize energy-efficient designs.
2. SoftBank and Tokyo
SoftBank, led by Masayoshi Son, is collaborating with Nvidia to build a supercomputer in Japan using Nvidia’s Blackwell chips, as announced in November 2024. This system is described as the first to use Nvidia’s Blackwell design, aligning with ambitions to advance Japan’s AI capabilities. While the exact location isn’t specified, SoftBank’s headquarters and major operations are in Tokyo, making it a likely candidate for hosting this supercomputer. The project aims to catch up in AI, focusing on large-scale computing for generative AI and related fields.
There’s no direct mention of a "25x energy-efficient" metric for this SoftBank-Nvidia supercomputer, but Nvidia’s Blackwell architecture (e.g., GB200 NVL72) is designed for high performance and efficiency, potentially achieving significant energy savings compared to older systems. The Tsubame series at the Institute of Science Tokyo, which has historically used Nvidia GPUs (e.g., Tsubame 3.0 with Tesla P100 and Tsubame 4.0 with H100 GPUs), also demonstrates Japan’s focus on energy-efficient supercomputing, with Tsubame 3.0 ranking first on the Green500 list of the world’s most energy-efficient supercomputers when it debuted in 2017.