In a collaborative effort to advance large-scale artificial intelligence (AI) systems for scientific discovery, the newly formed Trillion Parameter Consortium (TPC) has gained a significant member in the National Center for Supercomputing Applications (NCSA), Cyber News reported. Comprising experts from research institutes, federal laboratories, academia, and industry, TPC aims to collectively build trillion-parameter AI models trained exclusively on scientific data.
The open community of researchers, led by some of the brightest minds globally, seeks to pool knowledge, prevent duplication of effort, and coordinate AI projects to maximize their impact. Envisaging a global network of resources and expertise, TPC focuses on creating state-of-the-art large language models (LLMs) for science and engineering.
The genesis of this collaboration dates back to the deployment of exascale computing platforms such as Frontier, Aurora, and El Capitan at US Department of Energy laboratories. Recognizing the need for cooperation to develop models on par with commercial systems such as OpenAI’s GPT-4, TPC places the trustworthiness and reliability of its AI models at the forefront of large-scale AI.
Rick Stevens, Argonne associate laboratory director, described the ongoing development of frontier AI models and the preparation of vast scientific datasets for training. The NCSA is pivotal in this initiative, introducing its AI-focused advanced computing and data resource, DeltaAI, which is set to triple NCSA’s AI-focused computing capacity and significantly expand the NSF-funded advanced computing ecosystem when it launches in 2024.
Founding members, including Argonne National Laboratory, are concurrently developing their own AI models. AuroraGPT, Argonne’s creation, aspires to become a comprehensive brain for scientific researchers after extensive training. The TPC collaboration seeks to harness global efforts in identifying high-quality training data, designing and evaluating model architectures, and innovating model evaluation strategies concerning bias, trustworthiness, and goal alignment.