Background
Computing Power
Since the emergence of AI-generated content (AIGC, also called GenAI), text-to-image applications such as Stable Diffusion and Midjourney have significantly changed the design process and its ecosystem. Another major breakthrough of GenAI is the large language model, represented by ChatGPT, which for the first time made fluent machine-generated language interaction possible and will likewise significantly change business automation processes and ecosystems. However, GenAI technology relies on GPU computing power, so commercializing AIGC requires substantial GPU support.
Training large models requires hundreds to thousands of expensive computing servers such as the DGX A100 (equipped with eight A100 GPUs), costing millions of dollars and taking several months. In addition, fine-tuning open-source models, adapting them to vertical domains, and serving the inference workloads of AIGC projects all demand significant computing resources.
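To make the scale concrete, a rough back-of-the-envelope estimate is sketched below. All numbers are illustrative assumptions, not figures from this document: the total training compute (~3.14e23 FLOPs, a widely cited estimate for a GPT-3-scale model), the A100's peak dense BF16 throughput (~312 TFLOPS), and a ~35% hardware utilization rate are all assumed values.

```python
# Back-of-the-envelope estimate of large-model training time.
# Every constant below is an assumption for illustration only.

TOTAL_FLOPS = 3.14e23      # assumed training compute budget (GPT-3-scale estimate)
A100_PEAK_FLOPS = 312e12   # assumed per-GPU peak throughput (dense BF16)
UTILIZATION = 0.35         # assumed fraction of peak actually sustained
GPUS_PER_SERVER = 8        # DGX A100 configuration

def training_days(num_servers: int) -> float:
    """Days needed to burn the assumed compute budget on `num_servers` DGX servers."""
    effective_flops = num_servers * GPUS_PER_SERVER * A100_PEAK_FLOPS * UTILIZATION
    return TOTAL_FLOPS / effective_flops / 86_400  # 86,400 seconds per day

# With 128 servers (1,024 GPUs), the run takes on the order of a month;
# with far fewer servers, it stretches to several months or more.
print(f"{training_days(128):.1f} days")
```

Under these assumptions, even a 1,024-GPU cluster needs roughly a month of continuous operation, which is consistent with the multi-month, multi-million-dollar scale described above.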

P1X and AINUR believe that institutions with colossal computing resources, such as major internet platforms and governments, should not control large models, because such control is not conducive to aligning AI with all of humanity. If, due to computing-power constraints, the training and improvement of large models can only align with the interests of these large institutions, it will be a disaster for humanity. For healthy development, large models must also be in the hands of individuals or DAOs.
P1X and AINUR's Tinybrood incubation program is an attempt to change this status quo. Through technological innovation and web3 models, Tinybrood aims to reorganize low-cost civilian computing power from crypto mining and other sources into new AI computing resources. On this basis, AI startups and research institutions alike can develop AI's potential under a truly human-aligned value system.
The Tinybrood incubation program will launch specific support plans for this purpose.