Sam Altman's Rain AI: Neuromorphic NPUs vs. NVIDIA GPUs
Will the Sam Altman-backed Rain AI's neuromorphic NPUs threaten NVIDIA GPUs?
Recently (November 2024), Sam Altman (CEO of OpenAI) made headlines seeking investors for Rain AI, a company specializing in high-performance AI computing via neuromorphic processing units (NPUs), digital in-memory computing (D-IMC), and custom hardware spanning every layer of the AI stack.
Sam Altman initially made a personal investment of $1 million in Rain AI in 2018 and participated in its $25 million Series A round in 2022.
In 2019, OpenAI signed a nonbinding agreement to spend $51 million on Rain AI chips once they became available; some viewed this as a potential conflict of interest for Altman.
Supporters believe Rain AI has serious potential to challenge the dominance of GPUs in AI hardware, presenting a disruptive alternative to NVIDIA’s stronghold.
Yet skeptics question whether Rain AI will: (1) demonstrate practical utility in real-world applications, (2) establish itself as the leading neuromorphic computing company, and (3) keep pace with NVIDIA GPUs, given their rapid rate of improvement and ease of integration into existing AI workflows.
Rain AI Valuation & Funding History (2024)
Rain AI operates as a private company, which means it is not listed on public exchanges such as NASDAQ or NYSE.
Investment opportunities are currently restricted to accredited and institutional investors, along with venture funds, investment syndicates, and pre-IPO marketplaces.
Current Valuation: ~$600 million, a significant leap from its $250-350 million valuation in August 2023.
This valuation is driven by its ongoing Series B financing round, led by Sam Altman and supported by institutional backers, including Microsoft.
The valuation increase highlights growing confidence in Rain AI’s potential to disrupt the AI chip market.
Funding History
Rain AI has exhibited a strong funding trajectory, underscoring its appeal among high-profile investors:
Total Funds Raised: $143.22 million across 12 rounds.
2022 Series A Round: Secured $25 million, with Altman as a major investor.
2024 Series B Round: Currently targeting $150 million, with notable backing from Microsoft.
Investment Access
Opportunities to invest in Rain AI are currently limited to:
Accredited Investors: Individuals meeting SEC-defined income or net worth thresholds.
Institutional Investors: Including venture capital firms, hedge funds, and investment syndicates.
Pre-IPO Marketplaces: Platforms where accredited investors can purchase private shares before an IPO.
These restrictions emphasize Rain AI’s focus on strategic growth while remaining privately held to refine its technology and expand its market presence before going public.
Revenue & Profitability
Rain AI’s profitability timeline hinges on successful product launches, most likely in 2026-2027.
Rain AI is currently pre-revenue, focused on R&D and prototypes.
First customer-ready chips are planned for launch in 2025, with OpenAI expected to be among the earliest customers.
OpenAI’s $51 million (nonbinding) commitment would represent Rain AI’s initial revenue stream.
Rain AI is also seeking partnerships with hyperscalers and semiconductor firms.
Sam Altman’s Most Likely Motives for Investing in Rain AI
Why did Altman invest in Rain AI initially? Why is he recruiting investors for Rain AI in 2024?
Some sources (e.g., the NY Post) implied that Altman is promoting Rain AI to investors to challenge a “Musk-friendly NVIDIA” (Musk and Altman have had a well-known rivalry since their falling-out at OpenAI).
It’s possible this played some role (e.g., Altman could box out Musk if Rain AI becomes big), but it’s unlikely to be his primary motivation, especially considering he has been an investor since 2018.
It’s far more likely that Altman is investing in Rain AI as a strategic hedge against NVIDIA GPUs and/or complementary technology with NVIDIA GPUs.
1. Strategic Hedge Against NVIDIA Dominance
Rationale: Reduce reliance on NVIDIA GPUs, which dominate AI hardware with high costs ($40,000+ per unit) and supply chain risks.
Details: Rain AI’s neuromorphic chips provide a backup solution, improving leverage in negotiations and diversifying OpenAI’s hardware sources.
Impact: Strengthens OpenAI’s operational resilience while fostering competition in the AI hardware market.
2. Competitive Edge for OpenAI
Rationale: Secure exclusive access to Rain AI’s technology, creating a barrier for rivals like xAI and Anthropic.
Details: OpenAI’s $51M (nonbinding) pre-order would give it priority access to Rain AI’s NPUs, which are claimed to be roughly 85% more energy-efficient than GPUs for certain tasks, supporting scalability and cost savings.
Impact: Positions OpenAI as a leader in efficiency and scalability, leaving competitors at a disadvantage.
3. Complementary Technology to GPUs
Rationale: Address performance gaps where GPUs are less effective, such as energy-critical inference and edge AI tasks.
Details: Rain AI’s neuromorphic chips integrate processing and memory (digital in-memory computing), reducing latency and power consumption.
Impact: Ideal for real-time applications like ChatGPT inference, robotics, and edge computing.
4. Future Path to AGI and/or ASI
Rationale: Explore neuromorphic computing’s potential to overcome GPU-based limitations for AGI or ASI.
Details: Brain-inspired architectures emulate neural adaptability, opening possibilities for breakthroughs in cognitive AI systems.
Impact: Positions OpenAI at the forefront of AGI/ASI development with novel hardware architectures.
5. Cost Efficiency for Large AI Models
Rationale: Lower operational costs for inference-heavy applications where GPUs are power-intensive.
Details: Rain AI claims 85% lower energy consumption, significantly reducing costs in scaling large models.
Impact: Enables cost-effective deployment of AI services at scale.
6. Disruption of Specialized Tech Sectors
Rationale: Enter high-growth sectors like robotics, drones, and defense where traditional GPUs are less competitive.
Details: Rain AI’s Digital In-Memory Computing (D-IMC) positions it as a disruptor in these niches.
Impact: Challenges incumbents like NVIDIA and Intel, diversifying the AI hardware landscape.
7. Long-Term Growth Opportunity (Neuromorphic TAM)
Rationale: Invest in a promising company with substantial market potential.
Details: Rain AI has raised $143M with a $600M valuation, targeting sectors like autonomous vehicles and IoT.
Impact: Aligns with Altman’s vision for scalable, energy-efficient AI solutions in emerging markets.
Jean-Didier Allegrucci & Rain AI: Possible Impact
Rain AI’s hiring of Jean-Didier Allegrucci, a key architect of Apple’s A-series and M-series chips, is potentially a game-changer.
At Apple, his designs set benchmarks for power efficiency, scalability, and hardware-software integration—qualities Rain AI must replicate to compete with Nvidia.
Allegrucci’s expertise is expected to:
Streamline Mass Production: His experience with large-scale chip manufacturing can help Rain AI navigate the complexities of scaling its innovative analog-digital designs.
Enhance Integration: Drawing from Apple’s ecosystem approach, Allegrucci can drive seamless hardware-software compatibility, ensuring developer adoption.
Attract Top Talent & Partners: His reputation strengthens Rain AI’s ability to recruit top engineers and secure strategic collaborations with foundries and tech firms.
With his leadership, Rain AI gains a critical edge in overcoming technical and ecosystem challenges, positioning it to deliver on its ambitious claims.
Rain AI’s Chips in AI Hardware
The dominance of Nvidia in AI hardware—particularly GPUs—has shaped the trajectory of AI over the past decade.
Nvidia GPUs power most AI models, from training to inference, leveraging decades of optimization in parallel processing and software ecosystem development (e.g., CUDA).
However, the rising demand for efficient, scalable solutions has highlighted vulnerabilities such as high costs, supply chain bottlenecks, and energy consumption.
Rain AI seeks to address these challenges with neuromorphic processing units (NPUs), which promise superior energy efficiency by mimicking brain-like computing.
Neuromorphic Processing Units (NPUs)
Rain AI’s flagship product is its Neuromorphic Processing Unit (NPU), designed to emulate the brain's structure and functionality.
“Digital Dendrites” Technology: Mimics the branching structures of neurons in the human brain to improve efficiency and scalability of AI hardware.
Digital In-Memory Computing (D-IMC): Combines memory and processing, reducing latency and energy costs associated with moving data between separate units.
RISC-V Architecture: Offers programming flexibility, making Rain AI’s NPUs adaptable to various AI workloads.
Energy Efficiency: Claimed to be 85% more efficient than Nvidia GPUs for certain tasks, with a focus on reducing power usage in AI inference.
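To make the appeal of in-memory computing concrete, below is a toy back-of-envelope energy model. It is a rough sketch using illustrative, publicly cited order-of-magnitude figures (not Rain AI’s numbers), and it ignores many real-world factors; its only point is to show why keeping weights inside the compute array, rather than streaming them from off-chip DRAM, can dominate the energy picture for inference.

```python
# Toy energy model contrasting a conventional "move data to compute" pipeline
# with an in-memory-compute design. The per-operation energy figures are
# illustrative order-of-magnitude assumptions, NOT Rain AI's actual numbers.

E_MAC_PJ = 0.2          # energy per multiply-accumulate (picojoules), illustrative
E_DRAM_BYTE_PJ = 100.0  # energy per byte fetched from off-chip DRAM, illustrative
E_LOCAL_BYTE_PJ = 1.0   # energy per byte accessed in/near the compute array, illustrative

def matmul_energy_pj(m: int, n: int, k: int, bytes_per_weight: int, in_memory: bool) -> float:
    """Estimate energy for an (m x k) @ (k x n) matrix multiply."""
    macs = m * n * k
    weight_bytes = k * n * bytes_per_weight
    # Conventional: weights stream in from DRAM each time; in-memory: weights stay put.
    movement = weight_bytes * (E_LOCAL_BYTE_PJ if in_memory else E_DRAM_BYTE_PJ)
    return macs * E_MAC_PJ + movement

if __name__ == "__main__":
    # Batch-1 inference through a single 4096x4096 layer with 1-byte weights.
    conventional = matmul_energy_pj(1, 4096, 4096, 1, in_memory=False)
    imc = matmul_energy_pj(1, 4096, 4096, 1, in_memory=True)
    print(f"conventional: {conventional/1e6:.2f} uJ, in-memory: {imc/1e6:.2f} uJ")
```

In this toy setting the data-movement term dwarfs the arithmetic term; real-world savings depend heavily on workload, batch size, precision, and memory hierarchy, which is why Rain AI’s ~85% claim still needs commercial validation.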
Software Ecosystem
Rain AI has developed tools to ensure its hardware is user-friendly and attractive to developers:
Custom Quantization Algorithms: Optimize AI models to run efficiently without sacrificing accuracy.
RISC-V Programming Tools: Simplify the development process for applications running on NPUs.
Model Optimization: Techniques such as Low-Rank Adaptation (LoRA) for efficient fine-tuning of models.
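Rain AI’s quantization algorithms are proprietary, but the general idea they build on is standard. Below is a minimal, generic post-training int8 quantization sketch (symmetric, per-tensor) that illustrates the accuracy-vs-memory trade-off such tooling manages; it is not Rain AI’s actual pipeline.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor int8 quantization: w ~= scale * q."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Map int8 codes back to approximate float32 weights."""
    return q.astype(np.float32) * scale

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(0.0, 0.05, size=(4096, 4096)).astype(np.float32)
    q, scale = quantize_int8(w)
    err = np.abs(w - dequantize(q, scale)).mean()
    print(f"mean abs quantization error: {err:.6f} (scale={scale:.6f})")
    print(f"memory: fp32 {w.nbytes/1e6:.1f} MB -> int8 {q.nbytes/1e6:.1f} MB")
```

Dropping from 32-bit to 8-bit weights cuts memory traffic roughly 4x, which matters most on in-memory and edge hardware where data movement dominates the energy budget.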
Rain AI: Future Use Cases for Neuromorphic Chips
Included below are the potential use cases for Rain AI’s neuromorphic chips (NPUs).
It remains unknown whether NPUs will actually see heavy adoption or offer significant real-world advantages over alternatives (e.g., NVIDIA GPUs).
1. AI Companies (50-60%)
Rain AI’s NPUs are most likely to be adopted by AI companies for:
Reducing training costs for large-scale models like GPT.
Enabling energy-efficient inference for AI deployments.
AI companies such as OpenAI are natural early adopters due to their infrastructure-heavy operations and the need for cost-effective hardware.
Energy efficiency is increasingly critical as AI models scale, and OpenAI’s $51M pre-order highlights the importance of this sector as a revenue driver.
2. Edge AI Applications (20-25%)
Rain AI’s chips are well-suited for edge AI applications requiring real-time, low-power processing, including:
Robotics: On-device decision-making for autonomous systems.
Drones: Extending operational times via energy-efficient AI.
Autonomous Vehicles: Supporting real-time object detection and navigation.
Edge AI is a growing sector, with industries like robotics and autonomous vehicles prioritizing real-time performance and energy efficiency.
Neuromorphic chips offer distinct advantages in these use cases, but this segment will likely grow more slowly than the AI training market.
3. Defense & Aerospace (10-15%)
Rain AI’s NPUs could play a role in specialized applications where energy efficiency and adaptability are essential:
Military surveillance systems requiring real-time, low-power AI.
Satellites and autonomous platforms for power-constrained environments.
While well-suited for these applications, the defense and aerospace market remains niche, limiting its revenue potential relative to broader AI applications.
4. Consumer Electronics (5-10%)
Consumer electronics could leverage Rain AI’s energy-efficient capabilities in:
Wearables: Supporting health monitoring and augmented reality.
VR/AR Headsets: Providing real-time, power-efficient AI for immersive experiences.
However, this sector is dominated by custom silicon (e.g., Apple’s chips), making it challenging for Rain AI to achieve widespread adoption.
Niche applications in devices requiring advanced AI processing may still provide some opportunities.
Rain AI’s Competition in Neuromorphic Computing
Rain AI is not the only player in town with “neuromorphic computing” hardware (NPUs); there is plenty of competition from the likes of Intel, IBM, BrainChip, SynSense, and Qualcomm.
1. Pure-play Neuromorphic Companies
These companies specialize exclusively in neuromorphic hardware and solutions, targeting diverse applications ranging from edge AI to optimization.
BrainChip
Technology: Akida Neuromorphic Processor
Strengths: High-performance, ultra-low-power edge AI processing with real-time learning capabilities.
Applications: Smart sensors, robotics, IoT devices, and autonomous vehicles.
GrAI Matter Labs
Technology: NeuronFlow Technology
Strengths: Real-time processing for ultra-low power consumption.
Applications: Smart robotics, drones, and wearables requiring energy-efficient AI.
SynSense (formerly aiCTX)
Technology: Neuromorphic vision sensors and processors
Strengths: Compact and energy-efficient designs ideal for real-time vision processing.
Applications: Surveillance, AR/VR, and machine vision systems.
Innatera Nanosystems
Technology: Analog Neuromorphic Processors
Strengths: Exceptional cognitive processing capabilities at low power.
Applications: Natural language processing (NLP), auditory processing, and robotics.
Prophesee
Technology: Event-based Metavision Sensors
Strengths: Event-driven computing for faster response times and energy efficiency.
Applications: Industrial automation, health monitoring, and high-speed cameras.
MemComputing
Technology: Physics-Based Optimization Computing
Strengths: Solves complex optimization problems through neuromorphic-inspired designs.
Applications: Logistics, cryptography, and financial modeling.
Knowm
Technology: Memristor-Based Hardware Architectures
Strengths: Cutting-edge adaptability in learning and memory storage.
Applications: Adaptive AI systems and brain-machine interfaces.
2. Big Tech with Neuromorphic Initiatives
These established players leverage neuromorphic technology to enhance their existing platforms and explore next-generation computing paradigms.
Intel
Technology: Loihi 2 Processors and Hala Point System
Strengths: Advanced learning capabilities and scalable architectures.
Applications: Large-scale machine learning, robotics, and smart manufacturing.
IBM
Technology: TrueNorth Architecture
Strengths: Pioneering architecture with 1 million neurons for ultra-efficient computing.
Applications: AI research, data analytics, and simulation tasks.
Samsung Electronics
Technology: Neural Processing Units and Neuromorphic Processors
Strengths: Integration into consumer electronics for enhanced AI capabilities.
Applications: Smartphones, IoT devices, and edge computing.
SK Hynix
Technology: Ferroelectric-Based Neuromorphic Chips
Strengths: High-density memory solutions for advanced AI systems.
Applications: Memory-centric AI, data centers, and autonomous vehicles.
Qualcomm
Technology: Mobile-Focused Neuromorphic Chips
Strengths: Energy-efficient, scalable solutions for mobile platforms.
Applications: Smartphones, wearables, and mobile edge AI.
Is Rain AI Likely to Overtake NVIDIA?
No. The odds of Rain AI overtaking NVIDIA are close to zero over the next 5-10 years, and still very low beyond a 10-year timeframe.
Challenges
Unproven Claims: Rain AI’s efficiency and performance improvements are theoretical, with no commercial validation.
Scaling Issues: Transitioning from prototypes to mass production is fraught with risks, especially for analog-digital chips.
Ecosystem Barriers: Nvidia’s CUDA and software libraries dominate and would take years, if not decades, to rival.
Market Resistance: Customers may hesitate to switch from a proven solution to experimental technology.
NVIDIA’s Countermoves
Exploring Brain-Inspired Techniques: Nvidia research such as PilotNet (an end-to-end neural network for autonomous vehicles) shows a willingness to adopt brain-inspired approaches, even if it is not neuromorphic hardware per se.
Continuous Innovation: Nvidia’s $7 billion annual R&D budget allows rapid adaptation to emerging trends.
Ecosystem Lock-in: Nvidia’s software and hardware integration ensures developer loyalty.
Probability Assessment
Short-Term (5 Years): Rain AI is unlikely to challenge Nvidia significantly (10-20% probability). Nvidia’s entrenched position and ecosystem advantage are close to insurmountable in this timeframe.
Long-Term (10+ Years): Rain AI’s odds improve (30-40%) if it delivers on its efficiency claims, builds a robust ecosystem, and scales production effectively.
Odds of Rain AI dominating “Neuromorphic”
I asked ChatGPT & Claude the odds of Rain AI becoming the dominant player in neuromorphic computing (i.e., the highest neuromorphic market share).
Odds: 30-40%
What was the rationale? Rain AI’s proprietary technology (Digital Dendrites, D-IMC), strong funding ($143M+), and experienced leadership (e.g., Jean-Didier Allegrucci) position it well for success.
However, established competitors like Intel (Loihi) and IBM (TrueNorth) have substantial R&D budgets and broader ecosystems, limiting Rain AI’s chances of monopolizing the market.
The biggest challenges for Rain AI include: scaling manufacturing and production, building a developer-friendly ecosystem comparable to Nvidia’s CUDA, and establishing market dominance over well-funded incumbents.
Estimated neuromorphic market share (2030): 25-30%. This assumes steady adoption of Rain AI’s technology and successful scaling.
Will neuromorphic computing even be relevant in the future?
Probably, but it’s good to avoid drinking too much hype juice.
Why? Forms of computing such as neuromorphic, quantum, optical/photonic, analog, ASIC, DNA, and FPGA have been touted as transformative technologies for decades.
Many problems we thought were only solvable with these types of technology ended up being solved with a combination of software optimization and/or NVIDIA GPUs.
Therefore, we shouldn’t automatically assume that GPU alternatives will overtake GPUs – especially given the rapid annual improvement of NVIDIA GPU technology and the ability of NVIDIA GPUs to complement alternative forms of compute (e.g. NVIDIA GPU/quantum computing hybrid via CUDA-Q).
GPT-4o and Sonnet 3.5 suggest 2028-2035 as the estimated timeline for mainstream neuromorphic computing adoption.
Odds of future relevance: 70-80%. ChatGPT and Claude agree that neuromorphic computing will be relevant in the future.
Odds of key AI technology: 50-60%. ChatGPT and Claude agree that neuromorphic computing is likely to become a key AI technology (not *the* key AI tech).
Odds of dominant technology: 15-20%. Neuromorphic computing excels in specific niches but GPUs/TPUs are likely to continue dominating general-purpose AI workloads.
Or will NVIDIA render neuromorphic computing obsolete?
By the time neuromorphic technologies are developed and deployed at scale, Nvidia may adapt to address many of these use cases.
Odds NVIDIA renders neuromorphic computing obsolete: ~20%.
Odds NVIDIA relegates neuromorphic to highly niche markets: ~60%.
Why?
Power Efficiency Advancements
Nvidia is heavily investing in power-efficient GPU architectures like Hopper and future generations.
Techniques such as tensor sparsity, mixed-precision arithmetic, and hardware accelerators are rapidly closing the efficiency gap between GPUs and neuromorphic chips.
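As a generic illustration of the mixed-precision point (standard PyTorch usage, not an Nvidia- or Rain-specific feature), the sketch below runs matrix multiplies in a lower-precision dtype under autocast while numerically sensitive operations stay in float32:

```python
import torch

# Minimal mixed-precision illustration: inside the autocast region, the
# linear layers execute in float16 (GPU) or bfloat16 (CPU), cutting memory
# traffic and energy per operation, while autocast keeps numerically
# sensitive ops in float32.

device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.bfloat16

model = torch.nn.Sequential(
    torch.nn.Linear(1024, 4096),
    torch.nn.ReLU(),
    torch.nn.Linear(4096, 1024),
).to(device)

x = torch.randn(8, 1024, device=device)

with torch.autocast(device_type=device, dtype=dtype):
    y = model(x)  # matmuls run in the low-precision dtype

print(y.dtype)  # low-precision activations produced inside the autocast region
```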
Specialized Chips for Edge AI
Nvidia’s Jetson platform already targets edge AI with power-optimized GPUs.
Nvidia is likely to release custom ASICs or hybrid chips for low-power, real-time applications, directly competing in the markets neuromorphic computing aims to dominate.
Adoption of Event-Driven Architectures
Nvidia is actively exploring brain-inspired techniques like spiking neural networks and asynchronous processing.
Coupled with its software dominance (CUDA, TensorRT), Nvidia can integrate event-driven architectures while maintaining developer loyalty, reducing the need for alternative ecosystems like neuromorphic computing.
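For readers unfamiliar with the term, a spiking neural network is built from neurons that only emit discrete events (“spikes”) when their internal state crosses a threshold. Below is a minimal leaky integrate-and-fire (LIF) neuron, the textbook building block; this is a generic sketch, not Nvidia’s or Rain AI’s implementation.

```python
import numpy as np

def lif_simulate(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron over a sequence of inputs."""
    v = 0.0          # membrane potential
    spikes = []
    for t, i_in in enumerate(input_current):
        # Leak toward rest, then integrate the incoming current.
        v += dt * (-v / tau + i_in)
        if v >= v_thresh:
            spikes.append(t)  # event (spike) emitted
            v = v_reset       # reset after spiking
    return spikes

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    current = rng.uniform(0.0, 0.12, size=200)  # noisy input over 200 time steps
    spike_times = lif_simulate(current)
    print(f"{len(spike_times)} spikes at steps: {spike_times[:10]} ...")
```

Because downstream work is triggered only when spikes occur, event-driven hardware can sit idle most of the time, which is the source of neuromorphic computing’s efficiency claims.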
What do you think of Rain AI & neuromorphic computing?
Although Rain AI is unlikely to dethrone Nvidia in the short term, its focus on energy efficiency and neuromorphic design positions it as a complementary player in the AI ecosystem.
Rain AI faces noteworthy challenges, including scaling production, building a developer-friendly ecosystem, and overcoming Nvidia’s entrenched market position.
However, with proprietary technologies, strong leadership, and strategic partnerships like OpenAI, Rain AI has the potential to carve out a significant niche in specialized applications.
Altman’s investment reflects a long-term vision of mitigating hardware bottlenecks, reducing costs, and fostering competition in an increasingly Nvidia-dominated market.