CBRS

Cerebras Systems

Price ~$100 | Market cap $23B | April 15, 2026
Cerebras Systems Inc. (CBRS) -- Buffett / Munger / Klarman Summary
1 SNAPSHOT
Price: ~$100
Market Cap: $23B
2 BUSINESS

Cerebras has built a genuinely differentiated AI accelerator -- the wafer-scale engine represents a radical architectural departure that delivers measurable performance advantages over NVIDIA's GPUs for large model training and inference. The $20B+ OpenAI partnership could transform Cerebras from a niche UAE-dependent hardware vendor into a major AI infrastructure player. However, at the expected IPO valuation of $22-25B (~45x trailing revenue), the market is pricing in near-perfect execution on the OpenAI contract, successful customer diversification, and sustained hardware differentiation -- all while the company remains operationally unprofitable, depends on two UAE entities for 86% of current revenue, and lacks the software ecosystem moat that protects NVIDIA's dominance. The technology is real, but the price demands faith over evidence. Patient investors should wait for post-IPO volatility to create an entry below $15B market cap, where the risk/reward becomes compelling.

3 MOAT NARROW

Wafer-scale engine (WSE-3) architecture -- 56x larger chip than NVIDIA H100, 900K AI cores, 44GB on-chip SRAM. Genuine hardware differentiation but no ecosystem lock-in comparable to NVIDIA CUDA.

4 MANAGEMENT
CEO: Andrew Feldman

Average -- raised $2.8B in equity plus $1B debt, significant dilution. OpenAI deal is capital-intensive pivot requiring infrastructure build-out.

5 ECONOMICS
Op Margin: -15%
6 VALUATION
DCF Range: $48 - $72

Overvalued by 40-90% at IPO pricing ($22-25B) vs. base case fair value ($15-18B)

7 MUNGER INVERSION
| Kill Event | Severity | P() | E[Loss] |
| --- | --- | --- | --- |
| Extreme customer concentration -- 86% of 2025 revenue from two UAE-linked entities (MBZUAI 62%, G42 24%). One customer ecosystem, not two. | HIGH | - | - |
| No ecosystem moat. NVIDIA CUDA has 4M+ developers and 15+ years of libraries. Hardware advantages can be competed away; software ecosystems endure. | MED | - | - |
8 KLARMAN LENS
Downside Case

Extreme customer concentration -- 86% of 2025 revenue from two UAE-linked entities (MBZUAI 62%, G42 24%). One customer ecosystem, not two.

Why the Market Might Be Right

  • Export control tightening restricting UAE sales or advanced chip exports
  • OpenAI contract delays or renegotiation of terms
  • NVIDIA Blackwell Ultra or next-gen closing the performance gap
  • Post-IPO lock-up expiry selling pressure (typically 90-180 days)
  • Continued material weakness in internal controls leading to restatement

Catalysts

  • OpenAI infrastructure deployment milestones (250MW/year through 2028) validating the compute-as-a-service model
  • New hyperscaler customer wins beyond OpenAI (Meta, Oracle, Microsoft potential)
  • WSE-4 next-generation chip announcement expanding the performance lead
  • Gross margin expansion above 50% demonstrating hardware economics at scale
  • Customer diversification -- any quarter showing the top 2 customers below 70% of revenue

9 VERDICT WAIT
Quality: B- (Moderate) -- significant cash runway post-IPO, but massive capital requirements for the OpenAI infrastructure build. The $1B OpenAI loan at 6% adds a debt obligation.
Strong Buy: $40
Buy: $60
Fair Value: $72

Do not buy at IPO. Monitor post-IPO trading for correction to $12-15B market cap (~$48-60/share). Set alerts for lock-up expiry, quarterly earnings showing customer diversification, and OpenAI deployment milestones.


Cerebras Systems (CBRS) -- Deep Philosophical Analysis

The Core Question: What Would Buffett See?

Warren Buffett once said he only invests in businesses he understands. Cerebras presents an interesting test of that principle -- not because the technology is incomprehensible, but because the business model is in fundamental transition and the competitive dynamics are genuinely uncertain.

The core question is deceptively simple: Can the best chip win against the best ecosystem?

History's answer, overwhelmingly, is no. Intel's x86 architecture was technically inferior to MIPS, SPARC, Alpha, and PowerPC throughout the 1990s. It won anyway because of the software ecosystem. Microsoft Windows was technically inferior to NeXTSTEP, OS/2, and even classic Mac OS in many respects. It won because of the application ecosystem. VHS beat Betamax. The pattern is relentless: ecosystems compound, hardware advantages erode.

Cerebras has built the equivalent of a Betamax that is fifty-six times larger and demonstrably better at its core task. The WSE-3 is a genuine marvel of engineering -- 4 trillion transistors on a single wafer, 900,000 AI cores, memory bandwidth that makes NVIDIA's H100 look like it is communicating through a straw. When Cerebras runs Llama 4 Maverick at double the speed of NVIDIA's newest Blackwell system, that is not marketing -- it is physics.

But NVIDIA has CUDA. Four million developers. Fifteen years of optimized libraries. Every ML framework, every cloud provider, every enterprise AI team has built their muscle memory around NVIDIA's ecosystem. Switching from CUDA is not a technical decision -- it is an organizational transformation. And organizations resist transformation the way water resists being pushed uphill.

Moat Meditation: The Paradox of Radical Innovation

Charlie Munger would immediately identify the core paradox: Cerebras' greatest strength is also its greatest vulnerability. The wafer-scale approach is so radically different that it creates both a technological moat (no one else can easily replicate it) and a commercial disadvantage (no one else's software ecosystem supports it natively).

This is the classic innovator's dilemma viewed from the innovator's side. Cerebras has built a technology that is unambiguously superior for a specific class of workloads -- training and running very large language models. But superiority in a vacuum means nothing. The question is whether that superiority is large enough, and persistent enough, to overcome the gravitational pull of the incumbent ecosystem.

The honest answer is: probably not for general-purpose AI compute, but possibly for the specific niche of ultra-large model inference and training. And that niche, driven by the scaling laws of AI, may be growing fast enough to sustain a substantial business.

The OpenAI deal is the test case. If Cerebras can demonstrate that for OpenAI's specific workloads -- serving AI-powered coding tools, running inference on frontier models -- the WSE architecture delivers meaningfully better performance per dollar, then the ecosystem disadvantage matters less. OpenAI does not care about CUDA compatibility; they care about tokens per second per dollar.

The Owner's Mindset: Would Buffett Hold This for Twenty Years?

No. And he would be transparent about why.

First, the customer concentration. Buffett has consistently avoided businesses dependent on a single customer or government entity. Eighty-six percent of revenue from two UAE-linked organizations is not a business -- it is a contract. Contracts end. Governments change priorities. The UAE's AI ambitions could shift, slow, or be redirected to domestic alternatives. Cerebras has no control over this.

Second, the capital intensity. Buffett loves asset-light businesses with high returns on capital. Cerebras is pivoting from an already capital-intensive semiconductor design business to an even more capital-intensive cloud infrastructure business. The OpenAI deal requires building 750 megawatts of data center capacity. That is billions of dollars in physical infrastructure -- real estate, power systems, cooling, networking -- none of which constitutes a competitive moat. Any well-funded competitor can build a data center.

Third, the competitive dynamics. The AI accelerator market is attracting the most capable competitors in technology: NVIDIA with its $3+ trillion market cap and $30B+ R&D budget, Google with TPUs, Amazon with Trainium, Microsoft with Maia, and dozens of well-funded startups. When the richest companies in the world are competing for the same market, excess returns are competed away. This is the opposite of Buffett's preferred hunting ground -- boring businesses with few competitors.

Fourth, the valuation. At $22-25 billion, Cerebras is priced for perfection in an imperfect world. Buffett's margin of safety principle demands buying excellent businesses at fair prices, or fair businesses at excellent prices. Cerebras is an unproven business at an excellent-business price.

Risk Inversion: What Could Destroy This Business?

Inverting the question, as Munger would insist, reveals the fragility:

  1. Export controls tighten. One executive order restricting advanced AI chip sales to the UAE eliminates 86% of current revenue overnight. This is not a theoretical risk -- it nearly happened during the CFIUS review.

  2. OpenAI renegotiates. The $20B deal is a commitment, not a guarantee. If OpenAI's financial position weakens, or if NVIDIA offers competitive pricing, the deal could be restructured downward. OpenAI has leverage -- Cerebras has dependency.

  3. NVIDIA closes the gap. NVIDIA's Blackwell Ultra and Rubin architectures are specifically targeting the performance gaps Cerebras exploits. If NVIDIA delivers 80% of WSE performance with 100% CUDA compatibility, the value proposition for switching disappears.

  4. Execution failure on infrastructure. Building and operating 750MW of data center capacity is a fundamentally different business than designing chips. Many companies have failed at this transition. If Cerebras stumbles on data center build-out, the OpenAI deal becomes a liability rather than an asset.

  5. The AI winter scenario. If AI investment cycles downward -- and all technology investment cycles eventually do -- Cerebras faces a perfect storm of declining demand, fixed infrastructure costs, and concentrated customer exposure.

Valuation Philosophy: The Price of Dreams

Seth Klarman would note that at 45x trailing revenue with no operational profitability, the market is not valuing Cerebras as a business. It is valuing Cerebras as an option on a future where AI compute demand is insatiable and Cerebras captures a meaningful share of it. Options can be valuable, but they require appropriate pricing.

The appropriate price for this option, given the risks enumerated above, is substantially below the IPO range. A value investor requires compensation for uncertainty, and Cerebras offers more uncertainty per dollar of revenue than almost any company at this valuation level. The gross margins are below industry norms. The customer concentration is above any reasonable threshold. The competitive moat is narrow and time-limited.

At $10-12 billion, the risk/reward tilts favorably. At that price, you are paying 15-18x forward revenue for a company with 76% growth, a $20B backlog from OpenAI, and genuinely differentiated technology. The customer concentration is still terrifying, but at that valuation, you have enough margin of safety to absorb a significant setback.

At $22-25 billion, you are paying for the dream and accepting the nightmare scenarios for free.

The Patient Investor's Path

The disciplined approach is clear:

  1. Do not buy at IPO. The price does not compensate for the risk.
  2. Watch the first two quarters. Look for evidence of customer diversification, gross margin expansion, and OpenAI deployment progress.
  3. Wait for the inevitable correction. Every AI hardware IPO of the past five years has experienced a 30-50% drawdown within twelve months. Lock-up expiry (90-180 days post-IPO) often triggers significant selling.
  4. Accumulate below $15B market cap ($60/share), where the risk/reward becomes favorable.
  5. Size the position modestly (2-4% of portfolio). Even at a good price, the concentration risk and competitive uncertainty warrant a smaller position than a wide-moat business.

The WSE is a remarkable piece of engineering. Andrew Feldman and his team have built something genuinely novel. But remarkable engineering and remarkable investments are different things. The patient investor respects the technology while demanding the price reflect the risks.

Cerebras may indeed become a major force in AI compute. But the wise investor waits for the market to offer that possibility at a price that acknowledges it is a possibility, not a certainty.

Executive Summary

Cerebras Systems is building genuinely differentiated AI compute hardware -- wafer-scale engines (WSE) that use an entire 300mm silicon wafer as a single processor. The WSE-3 contains 4 trillion transistors and 900,000 AI-optimized cores, making it 56x larger than NVIDIA's H100. The company has achieved impressive revenue growth ($24.6M in 2022 to $510M in 2025) and recently turned GAAP profitable. However, the investment case is clouded by extreme customer concentration, an unproven competitive moat against NVIDIA's ecosystem lock-in, and a valuation that demands near-perfect execution.

Verdict: WAIT -- genuinely interesting technology, but the risk/reward at IPO pricing ($22-25B) does not offer adequate margin of safety. Monitor for a post-IPO correction to below $15B market cap.


Phase 1: Risk Assessment (Kill Screen)

1.1 Customer Concentration -- CRITICAL RISK

This is the single most important risk factor and nearly disqualifying on its own:

| Year | G42 % of Revenue | MBZUAI % of Revenue | Top 2 Customers Combined |
| --- | --- | --- | --- |
| 2022 | ~85%+ | Minimal | ~85%+ |
| 2023 | 83% | Minimal | ~83%+ |
| H1 2024 | 87% | Minimal | ~87% |
| 2025 | 24% | 62% | 86% |

The "diversification" from 2024 to 2025 is illusory. Revenue shifted from G42 to MBZUAI (Mohamed bin Zayed University of AI) -- but both entities are UAE-linked, sovereign-adjacent organizations. Cerebras effectively has ONE customer ecosystem (UAE AI infrastructure) accounting for 86% of revenue.

As of December 31, 2024, G42 represented 91% of total accounts receivable. This is a company where one customer's payment timing determines whether quarterly cash flow is positive or negative.

1.2 CFIUS / Geopolitical Risk -- HIGH

  • Cerebras withdrew its original IPO filing in October 2025 due to CFIUS review of the G42 relationship.
  • Resolution required restructuring G42's equity stake to non-voting shares, removing governance influence.
  • Export controls remain an ongoing risk -- WSE chips are advanced AI accelerators subject to U.S. export licensing.
  • Any tightening of U.S.-UAE tech relations or further export restrictions could devastate the customer base.
  • The Trump administration's fluctuating trade policies add uncertainty to export licensing.

1.3 Competitive Risk -- NVIDIA Ecosystem Lock-In -- HIGH

NVIDIA's competitive moat is NOT primarily hardware -- it is CUDA, the software ecosystem:

  • CUDA: ~4 million developers, 15+ years of libraries, frameworks, and tooling
  • DGX/HGX ecosystem: Standardized, well-understood deployment
  • InfiniBand/NVLink: Mature multi-chip interconnect
  • Supply chain: NVIDIA ships millions of GPUs; Cerebras ships hundreds of systems

Cerebras claims 97% less code required for LLM training vs. GPUs. Even if true, enterprise AI teams have invested years building CUDA-optimized pipelines. Switching costs are enormous.

The WSE is a better chip. NVIDIA has a better ecosystem. History consistently shows that ecosystems beat chips (x86 vs. superior RISC architectures, Windows vs. Mac in the 1990s, Android vs. better mobile OSes).

1.4 Manufacturing Risk -- MODERATE

  • Cerebras depends entirely on TSMC for WSE fabrication
  • Wafer-scale manufacturing is inherently more challenging than cutting individual dies
  • Single-source dependency with no alternative fab capable of this process
  • TSMC capacity allocation favors NVIDIA (their largest customer) and Apple over Cerebras

1.5 Financial Control Risk -- ELEVATED

The S-1 discloses material weaknesses in internal controls over financial reporting. For a company seeking a $22-25B IPO valuation, this is a serious governance concern and could lead to restatements or reporting delays post-IPO.

1.6 Kill Screen Verdict

The customer concentration alone would normally be disqualifying. Two entities within the same UAE sovereign ecosystem representing 86% of revenue is not a commercial business -- it is a government contract. However, the OpenAI partnership changes the forward-looking picture enough to keep this in analysis rather than immediate rejection.


Phase 2: Financial Analysis

2.1 Revenue Trajectory

| Year | Revenue | YoY Growth |
| --- | --- | --- |
| 2022 | $24.6M | -- |
| 2023 | $78.7M | 220% |
| 2024 | $290.3M | 269% |
| 2025 | $510.0M | 76% |

Revenue growth is exceptional but decelerating (from 269% to 76%). The 2025 figure of $510M is largely driven by UAE infrastructure deployments and may not represent diversified commercial demand.
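The deceleration can be checked directly from the reported figures; a minimal sketch using the revenue numbers quoted in this note:

```python
# Recompute the YoY growth rates from the revenue figures above ($M, per the S-1 data quoted here).
revenue = {2022: 24.6, 2023: 78.7, 2024: 290.3, 2025: 510.0}

years = sorted(revenue)
for prev, curr in zip(years, years[1:]):
    growth = (revenue[curr] / revenue[prev] - 1) * 100
    print(f"{curr}: {growth:.0f}% YoY")
# → 2023: 220%, 2024: 269%, 2025: 76%
```

The rates match the table, confirming the slowdown from roughly tripling annually to 76% growth.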

2.2 Profitability

| Metric | 2024 | 2025 |
| --- | --- | --- |
| GAAP Net Income (Loss) | ($481.6M) | $237.8M |
| Non-GAAP Net Income (Loss) | ($21.8M) | ($75.7M) |

The GAAP profitability in 2025 requires careful scrutiny:

  • The $237.8M GAAP net income sits alongside a $75.7M non-GAAP net loss
  • The ~$313M gap between GAAP and non-GAAP is likely related to mark-to-market gains on warrants, fair value adjustments, or one-time items
  • On an operational basis, Cerebras is still losing money
  • Stock-based compensation is a material expense excluded from non-GAAP figures
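The size of the GAAP-to-non-GAAP bridge is simple arithmetic on the 2025 figures quoted above:

```python
# The gap between GAAP profit and non-GAAP loss for 2025 ($M, figures from this note).
gaap_net_income = 237.8     # GAAP net income, 2025
non_gaap_result = -75.7     # non-GAAP net income (loss), 2025

gap = gaap_net_income - non_gaap_result
print(f"GAAP minus non-GAAP: ${gap:.1f}M")  # ~$313M of non-operational items
```

A $313.5M swing from non-operational items on $510M of revenue is why the headline GAAP profit deserves skepticism.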

2.3 Gross Margins

| Period | Gross Margin |
| --- | --- |
| 2022 | 11.7% |
| 2023 | 33.5% |
| H1 2024 | 41.1% |

Gross margin improvement from 11.7% to 41.1% is encouraging but still well below semiconductor peers:

  • NVIDIA: ~73-76% gross margin
  • AMD: ~50-53% gross margin
  • Cerebras at ~41%: Below industry norms for "differentiated" chip companies

The lower margins reflect (a) wafer-scale manufacturing costs, (b) volume discounts to concentrated customers, and (c) system-level sales that include non-chip components.

2.4 Operating Expenses

  • R&D: >50% of revenue (H1 2024), appropriate for a pre-profit chip company
  • G&A + Sales/Marketing: <15% of revenue in aggregate
  • Total operating expenses significantly exceed gross profit on a non-GAAP basis

2.5 Cash Position & Funding

| Round | Date | Amount | Valuation / Terms |
| --- | --- | --- | --- |
| Series G | Sep 2025 | $1.1B | $8.1B post-money |
| Series H | Feb 2026 | $1.0B | $23B post-money |
| OpenAI Loan | Jan 2026 | $1.0B | 6% annual interest |
| IPO (expected) | Q2 2026 | ~$2.0B | $22-25B |

Total capital raised to date: ~$2.8B across eight rounds, plus the $1B OpenAI loan. Post-IPO, Cerebras will have access to a Morgan Stanley revolving credit facility of up to $850M.

2.6 Cash Burn Assessment

With non-GAAP losses of ~$76M in 2025 and massive infrastructure build-out required for the OpenAI contract, cash burn will accelerate. The OpenAI deal requires Cerebras to deploy 750MW of compute infrastructure -- data center build-out at this scale costs billions. The $1B OpenAI loan and IPO proceeds are specifically targeted at this deployment.


Phase 3: Moat Assessment

3.1 Technology Moat -- NARROW but GENUINE

The WSE architecture represents real engineering innovation:

| Metric | WSE-3 | NVIDIA H100 | WSE-3 Advantage |
| --- | --- | --- | --- |
| Transistors | 4 trillion | 80 billion | 50x |
| AI Cores | 900,000 | ~17,000 | 53x |
| On-Chip SRAM | 44 GB | 50 MB | 880x |
| Memory Bandwidth | 21 PB/s | 3.35 TB/s | ~6,300x |
| Chip Area | 46,225 mm² | 814 mm² | 56x |

These are not incremental improvements -- they represent a fundamentally different architectural approach. The WSE eliminates multi-chip communication bottlenecks by keeping everything on a single wafer.

Performance claims: Cerebras delivered Llama 4 Maverick inference at 2,500 tokens/sec per user -- more than double NVIDIA's DGX B200 Blackwell on the same 400B parameter model.
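The advantage multiples in the spec comparison can be recovered from the raw figures; a sketch normalizing units before dividing (spec values as quoted in this note):

```python
# Verify the WSE-3 vs. H100 advantage multiples from the raw specs above.
# Units normalized: SRAM in GB, bandwidth in TB/s (21 PB/s = 21,000 TB/s).
wse3 = {"transistors": 4e12, "ai_cores": 900_000, "sram_gb": 44, "bandwidth_tb_s": 21_000, "area_mm2": 46_225}
h100 = {"transistors": 80e9, "ai_cores": 17_000, "sram_gb": 0.050, "bandwidth_tb_s": 3.35, "area_mm2": 814}

for metric in wse3:
    print(f"{metric}: {wse3[metric] / h100[metric]:,.1f}x")
```

The output (50x transistors, ~53x cores, 880x SRAM, ~6,270x bandwidth, ~57x area) matches the table to rounding.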

3.2 Ecosystem Moat -- NONE

This is where the bull case breaks down:

  • Software ecosystem: Cerebras uses PyTorch/TensorFlow compatibility layers, but lacks anything comparable to CUDA's depth
  • Developer community: Tiny compared to NVIDIA's 4M+ CUDA developers
  • Third-party support: Limited ISV integrations vs. NVIDIA's universal support
  • Training data: Most ML teams have optimized their training pipelines for NVIDIA GPUs

3.3 Switching Costs -- LOW TO MODERATE

  • For inference workloads: Relatively easy to switch (API-level compatibility)
  • For training workloads: Moderate switching costs (need to revalidate training runs)
  • For enterprise deployments: Low (customers are generally cloud/service buyers, not hardware owners)

3.4 Overall Moat Assessment: NARROW

Cerebras has a genuine technology advantage in raw chip performance, but technology moats are the weakest form of competitive advantage. They can be eroded by (a) NVIDIA improving its own architecture, (b) custom ASICs from hyperscalers (Google TPU, Amazon Trainium, Microsoft Maia), or (c) new entrants. The WSE approach is defensible for 3-5 years but not 10-20 years. Without building a software ecosystem to match CUDA, the hardware advantage alone is insufficient for a wide moat.


Phase 4: Synthesis & Valuation

4.1 The OpenAI Partnership Changes Everything -- Maybe

The $20B+ OpenAI deal (potentially $30B with extensions through 2030) transforms Cerebras from a UAE-dependent niche player into a potential major AI infrastructure provider. Key terms:

  • 750MW of Cerebras compute for OpenAI through 2028, with option for 1.25GW more through 2030
  • $1B loan from OpenAI at 6% interest
  • Warrants for 33.4M Class N (non-voting) shares at $0.00001/share
  • If total expenditures reach $30B, OpenAI acquires ~10% stake in Cerebras

Bull case: OpenAI becomes 50%+ of revenue by 2027, replacing UAE concentration with U.S. concentration from the world's leading AI company.

Bear case: This is a compute services contract, not a chip sale. Cerebras must build and operate data centers -- fundamentally different from selling hardware. Capital intensity is enormous, and Cerebras becomes a quasi-cloud provider competing with AWS, Azure, and GCP.

4.2 Valuation Framework

At $23B IPO valuation on $510M 2025 revenue:

  • Price/Sales: ~45x
  • Price/Non-GAAP Earnings: Negative (still losing money operationally)
  • EV/Revenue: ~43x (adjusting for cash/debt)
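The headline multiple follows directly from the figures above; a minimal sketch using the $23B midpoint of the expected range:

```python
# Price/Sales at the expected IPO valuation (figures from this note, $M).
market_cap = 23_000    # midpoint of the $22-25B IPO range
revenue_2025 = 510     # 2025 revenue

ps = market_cap / revenue_2025
print(f"Price/Sales (trailing): {ps:.0f}x")  # ~45x
```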

Peer comparison:

| Company | P/S (Forward) | Gross Margin | Revenue Growth |
| --- | --- | --- | --- |
| NVIDIA | ~30x | 73-76% | ~50-60% |
| AMD | ~10x | 50-53% | ~15-20% |
| Cerebras (IPO) | ~45x | ~41% | 76% (decelerating) |

Cerebras is priced at a premium to NVIDIA despite:

  • Much lower gross margins
  • Extreme customer concentration
  • No profitability on an operational basis
  • Unproven at scale in competitive markets
  • No ecosystem moat

4.3 What the Market is Pricing In

At $22-25B, the market assumes:

  1. OpenAI deal executes fully ($20B+ over 3+ years)
  2. Revenue reaches $2-3B by 2028
  3. Gross margins expand to 50%+ at scale
  4. Customer diversification beyond UAE + OpenAI
  5. No CFIUS/export control setbacks
  6. NVIDIA doesn't close the performance gap

This is an aggressive set of assumptions with limited margin of safety.

4.4 Fair Value Estimation

Bear case ($8-12B): UAE revenue stagnates, OpenAI deal under-delivers, margins stay compressed. 15-20x forward revenue on $600M 2026E revenue.

Base case ($15-18B): OpenAI ramp proceeds but slower than planned, moderate customer diversification, margins reach 45%. 20-25x forward revenue on $700-800M 2026E revenue.

Bull case ($25-35B): Full OpenAI execution, new hyperscaler wins, margins reach 55%, revenue $1B+ by 2027. 30-35x forward revenue.
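The scenario ranges can be approximately recovered from the stated multiples and forward-revenue assumptions; a sketch (the note rounds its final ranges slightly differently, e.g. a $600M bear case at 15x gives $9B rather than $8B):

```python
# Implied valuation ranges from the scenario multiples and 2026E/2027E revenue assumptions above.
scenarios = {
    "Bear": ((15, 20), 600),    # (P/S low-high, forward revenue $M)
    "Base": ((20, 25), 750),    # midpoint of the $700-800M assumption
    "Bull": ((30, 35), 1000),   # "$1B+ by 2027"
}

for name, ((lo, hi), rev) in scenarios.items():
    print(f"{name}: ${lo * rev / 1000:.0f}B - ${hi * rev / 1000:.0f}B")
```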

4.5 Entry Price Framework

Given the IPO is expected at $22-25B market cap:

| Level | Market Cap | Per Share (est. ~250M diluted) | Rationale |
| --- | --- | --- | --- |
| Strong Buy | <$10B | <$40 | >50% discount to base case, true margin of safety |
| Accumulate | $12-15B | $48-60 | 15-20% discount to base case |
| Fair Value | $15-18B | $60-72 | Base case valuation |
| Overvalued | >$22B | >$88 | IPO pricing, limited upside |

The IPO price of $88-100/share (implied by $22-25B valuation) is above our fair value range.
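The market-cap-to-per-share conversion is mechanical once the share count is fixed; a sketch using the ~250M diluted share estimate from this note:

```python
# Convert market-cap levels to per-share prices ($M caps, millions of shares; ~250M is this note's estimate).
diluted_shares = 250

levels = {"Strong Buy": 10_000, "Accumulate": 15_000, "Fair Value": 18_000, "IPO midpoint+": 25_000}
for name, cap in levels.items():
    print(f"{name}: ${cap / diluted_shares:.0f}/share")
# → $40, $60, $72, $100
```

Note this sensitivity: if post-IPO dilution pushes the count meaningfully above 250M, every per-share threshold in the framework shifts down proportionally.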


Key Positives

  1. Genuinely differentiated technology -- WSE is not an incremental improvement; it is a fundamentally different approach to AI compute
  2. OpenAI partnership -- Transformative contract that could drive $20B+ in revenue over 5 years
  3. Revenue momentum -- $24.6M to $510M in three years is exceptional growth
  4. TSMC partnership -- Validated wafer-scale manufacturing at the world's best foundry
  5. Strong investor syndicate -- Tiger Global, Fidelity, AMD, Benchmark, Coatue
  6. Inference performance -- 2x+ advantage over NVIDIA Blackwell on large models
  7. Expanding customer base -- IBM, Meta, Mistral AI, Hugging Face, Oracle Cloud

Key Negatives

  1. 86% revenue from two UAE-linked entities -- Existential concentration risk
  2. No operational profitability -- Non-GAAP net loss of $75.7M in 2025
  3. No ecosystem moat -- Hardware advantage without software lock-in is fragile
  4. Material weaknesses in internal controls -- Governance concern at IPO
  5. Capital-intensive pivot -- OpenAI deal requires becoming a cloud infrastructure operator
  6. Valuation premium -- 45x P/S exceeds NVIDIA despite lower quality metrics
  7. Single-source manufacturing -- TSMC dependency with no alternative
  8. CFIUS overhang -- Export controls could tighten at any time

Final Verdict

WAIT at IPO pricing ($22-25B). The technology is genuinely impressive and the OpenAI partnership is transformative, but the risk/reward does not compensate for extreme customer concentration, no operational profitability, and a valuation premium to NVIDIA. Cerebras needs to prove it can (a) diversify revenue beyond UAE + OpenAI, (b) achieve sustainable gross margins above 50%, and (c) demonstrate the OpenAI infrastructure build-out is capital-efficient.

Target entry: Accumulate below $15B market cap ($60/share). Strong Buy below $10B ($40/share). Given the typical post-IPO volatility for AI hardware companies and the specific risks here, a correction to the $12-15B range within 6-12 months of listing is plausible.

Monitor triggers:

  • Q1/Q2 2026 earnings showing customer diversification progress
  • OpenAI infrastructure deployment milestones
  • Any export control policy changes
  • Gross margin trajectory toward 50%+
  • Resolution of material weakness in internal controls

Analysis based on S-1 filing data, public sources, and first-principles valuation. No analyst reports used as primary inputs. All financial data sourced from SEC filings and company disclosures.

=== VERDICT: CBRS | WAIT | SB:$40 | Acc:$60 | Expected_IPO_Range:$88-100 ===