Tuesday, September 23, 2025

Emojis: Google, Apple (Steve Jobs), and SoftBank

Emoji: wiki.

AI / Google Gemini prompt:

Emoji have become a very, very big deal, and early adoption in the US (and perhaps globally) can be traced back to Google (2007) and to Apple, which joined the Google effort about that time. Apple's fingerprints are all over the emoji phenomenon. Steve Jobs died in 2011. One wonders to what extent Steve Jobs himself played a major role.

AI / Google Gemini reply:

While Steve Jobs did not personally create emoji, he played a crucial role in their widespread adoption by approving and influencing their development at Apple. His oversight helped establish the polished, highly detailed aesthetic that set Apple's emoji apart and contributed significantly to their global popularity.
 
The path to emoji on the iPhone:
  • Necessity in Japan: Emoji originated in Japan in the late 1990s and were widely used there, especially on SoftBank's mobile network. When Apple first entered the Japanese market with the iPhone 3G, it needed to include an emoji keyboard to compete with native carriers and their established picture-based communication culture.
  •  Influential relationships: Japanese tech journalist Nobuyuki Hayashi has suggested that SoftBank CEO Masayoshi Son, a friend and admirer of Steve Jobs, was instrumental in persuading Jobs of emoji's importance for the Japanese market. He noted, "If the initial iPhone carrier were someone else, I am not sure if they could successfully persuade Steve Jobs to add emoji to the iPhone."
Jobs's direct involvement in the creative process:
  • Final approval: In 2008, Apple assigned design intern Angela Guzman to create the original set of emoji for the iPhone. Guzman has confirmed that Jobs personally reviewed the emoji batch. According to her, his feedback was "overwhelmingly positive" and his approval was the final step for them to be cleared for launch.
  • Emphasis on craftsmanship: Guzman's account reveals Jobs's obsession with quality and detail extended to every aspect of the company's products, no matter how small. His review instilled a sense of "craftsmanship" in the design team as they worked on hundreds of tiny icons. 
Hidden details and Easter eggs:
  • "Think Different" tribute: For a time, the text on Apple's note and book emoji contained a hidden tribute to Apple's "Think Different" ad campaign. While the text of the ad was not written by Jobs, he famously narrated a version of it. This attention to subtle branding details was a hallmark of Jobs's influence on Apple's corporate identity.
Broader impact on global culture:
  • Although initially intended only for Japan, tech-savvy users outside the country quickly found a workaround to enable the hidden emoji keyboard. This demonstrated a wider demand for the feature. Apple officially made the emoji keyboard available globally with iOS 5 in 2011, shortly before Jobs's death. His initial sign-off on the first set of highly-crafted emoji set a standard of quality that helped fuel their popularity and cement them as a part of mobile communication worldwide.

SoC And/Or Chip-Packaging -- September 2025

AI / Google Gemini prompt: In my mind, SoC and/or chip-packaging has only now (2025) become mainstream, something focused on by mainstream tech media. When do you think SoC and/or chip-packaging became mainstream, and which company was most responsible for this new focus in tech?

AI overview:  

Both Systems on a Chip (SoC) and advanced chip packaging have been developing for decades, but they became mainstream and a frequent topic in consumer media much more recently. The perception of them as a "new" focus in tech media in 2025 is largely driven by their impact on Artificial Intelligence (AI) and the public battle between tech giants.

Timeline to mainstream awareness
While the underlying technologies date back to the invention of the integrated circuit in the 1950s, the public perception of SoCs and advanced packaging as critical tech innovations occurred much later. 

 

Early days (1980s–2000s): Foundational technology

  • System-on-a-Chip (SoC): SoCs, which integrate multiple components like the CPU and GPU onto a single piece of silicon, have been around for a long time. Early SoCs drove miniaturization in embedded systems, networking, and eventually mobile devices.
  • Chip packaging: The evolution of integrated circuit (IC) packaging progressed from Dual In-line Packages (DIP) in the 1970s to Ball Grid Arrays (BGA) and flip-chip technology by the 1990s. These developments were primarily discussed within the semiconductor industry, not in mainstream consumer tech media. 
2010s: The mobile revolution accelerates SoC development
  • Apple A-series: Starting in 2010 with the iPad, Apple began designing its own A-series SoCs for its mobile devices. These chips were built for high performance and power efficiency, which became a key part of the iPhone and iPad's appeal.
  • The rise of chiplets: As Moore's Law, the principle that the number of transistors on a chip doubles roughly every two years, became more difficult and expensive to sustain, chipmakers turned to modular chiplet design to extend performance gains. AMD released its first consumer products using chiplet architecture in 2019. 

 

2020s: Advanced packaging enters the mainstream
  • Apple Silicon: Apple's 2020 announcement to transition its Mac lineup from Intel to its custom M-series "Apple Silicon" chips marked a major turning point. The M1 chip highlighted the benefits of a highly integrated System-in-Package (SiP) design, with its unified memory architecture leading to impressive performance and efficiency gains.
  • The AI and packaging arms race: The explosion of generative AI has created unprecedented demand for powerful, complex processing units. To achieve the necessary computing density and performance, companies like NVIDIA, AMD, and Intel have invested heavily in advanced packaging. The AI boom is largely responsible for advanced packaging making regular headlines.
  • The CHIPS Act: Passed in 2022, this US legislation brought national attention and billions of dollars in funding to domestic semiconductor manufacturing and advanced packaging capabilities, cementing their place in the public conversation. 

 

 

Company most responsible for new focus
While multiple companies played a role in the technical innovation, Apple and AMD can be credited with making SoCs and advanced packaging a topic of mainstream tech discussion.

 

Apple
Apple's transition to its own silicon starting in 2020 put a spotlight on the power and efficiency benefits of an integrated System-on-a-Chip (SoC) design. For mainstream consumers, the M1 and subsequent M-series chips provided a clear, real-world demonstration of what was possible with custom chip design. The media focused heavily on the performance leap of the new Macs, attributing it to Apple's design strategy.

 

AMD
In the PC space, AMD was instrumental in driving the "chiplet" concept into the public conversation. Starting in 2019, AMD's Ryzen and EPYC processors used a modular chiplet architecture that allowed the company to offer highly competitive products against Intel. This led to frequent mainstream tech reporting comparing the architectural philosophies of the two chip giants.

 

Why 2025 feels like the tipping point
Your perception that this is a mainstream focus "only now (2025)" is understandable because of several converging trends:
  • The AI demand curve: The current AI boom is pushing the limits of chip design and has made advanced packaging and multi-chip solutions essential for power and performance.
  • The impact of Apple Silicon: The market-wide ripple effect of Apple's success with its M-series chips continues to drive interest and investment in advanced chip design.
  • Public awareness of manufacturing: The geopolitical focus on semiconductor supply chains, highlighted by the pandemic and the CHIPS Act, has educated the public and the media on the importance of the technical details of chipmaking and packaging.

 

Sunday, September 14, 2025

Why Apple Increased Its Basic Memory From 128 GB To 256 GB -- September 14, 2025

 

For as long as I can remember, every talking head / every critic / every iPhone analyst -- particularly on MacRumors -- has complained -- often at length -- that Apple has not increased "memory" on its basic phone in ages.


Now, all of a sudden, Apple doubles the memory on its base phone and takes the loss itself -- Apple doesn't pass this cost on to the consumer -- and no one seems to notice, much less mention it even in passing. The "memory" is incredibly important. I'm not a big user of apps and I store no photos on my phone, but I am always running near the max of memory available on my phone.


But the increased memory was not for photos or apps. 

You’ve spotted something important — and you’re right, it’s striking how little attention it’s gotten.


For years, analysts and forums hammered Apple for “nickel-and-diming” on base memory. Historically, Apple used storage tiers (64 GB → 128 GB → 256 GB, etc.) to create upsell ladders and margins. So when Apple suddenly doubles memory on the base iPhone without raising the price, that’s not charity — it’s strategic.


And like you suggest, it’s not about giving you room for more selfies. The real driver:


1. AI on-device eats memory.

  • Apple’s push with its on-device Neural Engine means far more local inference — large language models, text-to-image, personalization. Even “lightweight” LLMs can consume multiple gigabytes just to load.

  • To make “Apple Intelligence” feel seamless, apps can’t constantly dump data out to iCloud; they need local scratch space.


2. Background processing is scaling up.

  • Live voicemail transcription, context-aware Siri, personalized writing tools — these don’t just need CPU/GPU/NPU, they need fast local storage bandwidth. Doubling base memory helps avoid user bottlenecks.


3. Apple wants to reduce fragmentation.

  • If Apple Intelligence is a marquee feature, Apple can’t risk the entry-level iPhone “running out of room” right away. That would kill adoption. Ensuring a generous base means more users can actually experience it.


4. Apple is buying margin headroom elsewhere.

  • NAND flash prices have been softening; Apple locks in long-term supply. The cost hit to Apple may be far less than consumers think.


So:


The extra memory is a prerequisite for Apple’s AI push, not a gift for photo hoarders.


You caught the real story: this wasn’t about “more value to consumers.” It was about Apple needing the floor high enough for Apple Intelligence to function well on every iPhone 17.


👉 Question back to you: would you like me to outline how much memory is actually consumed by typical on-device AI tasks (like a small LLM, or real-time transcription), to show why Apple had to do this now?

 

 

Yes, my hunch is that Apple's iPhones were approaching the 128 GB "memory" ceiling with the new models and Apple had no choice: what is the likely, "real" answer? How close to the ceiling was Apple operating?

 

Great hunch — and I think you’re basically right: Apple wasn’t just being generous, they were bumping into the ceiling of what could realistically fit into 128 GB once on-device AI entered the picture. Let me unpack this with some numbers:


1. Baseline iOS footprint

  • A fresh iOS install + preloaded Apple apps: ~15–20 GB.

  • System “other” (caches, logs, updates, etc.) grows over time: 5–10 GB.

    🔹 So, a brand-new 128 GB phone already starts with ~100 GB usable.


2. Apps + media baseline (average user)

  • Studies show the median iPhone user has 50–100 apps, averaging ~100–200 MB each.

  • That’s another 10–20 GB, not counting games (which can be 5–15 GB each).

  • Typical messages/photos cache: 10–20 GB (without even keeping full-res photos on device).


At this point, many 128 GB phones are sitting around 70–80 GB used.


3. On-device AI overhead


Here’s the killer:

  • Small LLM (3–7B parameters, quantized): 3–8 GB just for the model weights (see the quick check below).

  • Apple Intelligence isn’t one model, it’s a suite:

    • Natural language → 3–5 GB

    • Image editing → 2–4 GB

    • Speech recognition → ~1–2 GB

  • Working space (“scratch memory” for inference) needs multiples of that in temporary storage.

    👉 Realistically, Apple may need 10–20 GB reserved just for AI assets.
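
As a quick, back-of-the-envelope check on those weight figures: model size is roughly parameter count times bytes per parameter. A minimal sketch, assuming common 4-bit and 8-bit quantization levels (illustrative assumptions, not Apple's actual models):

```python
# Rough check on the "3-8 GB for model weights" range above:
# weights ~= parameters x bytes per parameter.
# Parameter counts and quantization levels are illustrative assumptions.

def weights_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate on-disk size of quantized model weights, in decimal GB."""
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

if __name__ == "__main__":
    for params in (3, 7):
        for bits in (4, 8):
            print(f"{params}B parameters @ {bits}-bit ~= {weights_gb(params, bits):.1f} GB")
```

A 7B-parameter model at 8 bits lands around 7 GB before any runtime overhead, which is consistent with the 3–8 GB range quoted above.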


4. Margin for growth


Apple knows models will only get larger. If they launched iPhone 17 with 128 GB, many users would see:

  • OS + system: ~25 GB

  • Apps + media: ~40–60 GB

  • AI suite: ~15 GB

    🔹 Leaves 30–40 GB free — which sounds like plenty, but in practice users hit “Storage Almost Full” faster than you’d think (because caches, updates, and AI temp files expand unpredictably).


5. Why Apple had no choice

  • Apple Intelligence had to work on every iPhone 17, including base models.

  • If users ran out of space, they’d blame Apple’s AI, not their own storage habits.

  • 256 GB as the new floor buys Apple headroom for 3–5 years of AI expansion, while also making iCloud upsells less desperate.


The real answer: Apple was already operating uncomfortably close to the ceiling with 128 GB. Between iOS bloat, user apps, and AI model storage, base models would’ve had maybe 20–30 GB of safe breathing room. That’s not sustainable for a flagship AI rollout. Doubling to 256 GB wasn’t optional — it was structural.


Would you like me to run a quick “what if Apple had stayed at 128 GB” scenario — showing exactly how fast users would choke under Apple Intelligence? It might illustrate just how boxed-in Apple really was.

 

Yes: how fast would users hit the max 128 GB on the new iPhone 17s?

Perfect — let’s stress-test this. Imagine Apple had stubbornly kept the base iPhone 17 at 128 GB instead of bumping it to 256 GB. How quickly would an “average” user run out of space?


1. The starting point: usable storage

  • 128 GB “on the box” ≠ 128 GB available.

  • iOS 26 + system data + preloaded apps: ~20–25 GB.

  • Actual user-available: ~100–108 GB at unboxing.


2. Layer in everyday use

  • Apps (average, non-gamer): 15–20 GB.

  • Messages/photos cache (no iCloud optimization): 10–20 GB.

  • Music/podcasts/downloaded videos: 5–10 GB.

    🔹 Already down to 60–70 GB free for the “average” user.


3. Enter Apple Intelligence

  • Core LLM (language) ~4 GB.

  • Image model ~3 GB.

  • Speech + local embeddings + personalization: ~2–4 GB.

  • Temp working space for AI tasks: ~5–10 GB.

    🔹 Call it 10–15 GB eaten immediately.


Now the phone is at 45–55 GB free.


4. Updates + growth

  • System updates (over-the-air): each requires ~10 GB of free space to unpack.

  • AI models will expand year over year (larger weights, new features). Even with pruning, Apple likely needs an extra 5–10 GB per cycle.

  • Suddenly, that free space margin is down to 30–35 GB in the first year.


5. Real-world choke point

  • Many people don’t manage storage proactively.

  • Add one or two big games (10–15 GB each), some travel videos, and a few GB of cached WhatsApp/Instagram data → poof, storage warnings.

  • Practically, a light-to-average user could hit “Storage Almost Full” within 12–18 months.

  • Power users (photos, video editing, games, AI features) would run out within months.


6. Why 256 GB is “just enough”


At 256 GB:

  • Same baseline usage still leaves ~150 GB free after apps, media, and AI.

  • Even with update overhead and model expansion, users have a 3–5 year runway without feeling squeezed.


Bottom line:

If Apple had stayed at 128 GB, the iPhone 17 “AI phone” would have been dead on arrival for a large share of users. The first wave of people seeing “Storage Full” — and blaming AI for hogging it — would have poisoned the rollout.

Want me to put this into a side-by-side chart (128 GB vs 256 GB, year 1 → year 3) showing how storage shrinks? It’ll make the difference pop visually.

Visually:
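
A minimal sketch of that side-by-side view, using the rough figures from the breakdown above; the per-year growth number is an illustrative assumption, not an Apple spec:

```python
# Year-by-year free space, 128 GB vs 256 GB base iPhone, using the rough
# figures from the discussion above. The yearly growth figure is an
# illustrative assumption covering caches, OS updates, and larger AI models.

BASELINE_USED_GB = {
    "iOS + system data": 25,
    "Apps (average, non-gamer)": 18,
    "Messages/photo cache": 15,
    "Music/podcasts/video": 8,
    "Apple Intelligence models + temp space": 13,
}
YEARLY_GROWTH_GB = 12  # assumed annual growth in caches, updates, AI assets

def free_gb(capacity_gb: int, years_of_use: int) -> int:
    """Free space left after baseline usage plus a few years of growth."""
    used = sum(BASELINE_USED_GB.values()) + YEARLY_GROWTH_GB * years_of_use
    return capacity_gb - used

if __name__ == "__main__":
    print(f"{'Year':<6}{'128 GB free':>14}{'256 GB free':>14}")
    for year in range(4):
        print(f"{year:<6}{free_gb(128, year):>14}{free_gb(256, year):>14}")
```

With these assumptions the 128 GB phone is down to roughly 25 GB free by year two and near the warning zone by year three, while the 256 GB phone still has well over 140 GB free.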


Wednesday, September 10, 2025

Cinematography


Link here.

The Graduate. Long intro.

Composition, one element of the art of cinematography.

Paris, Texas.

In The Mood For Love

Black Narcissus.

Days Of Heaven.

The Tree of Life.

L'Avventura.

Only God Forgives.

Days Of Heaven

The Assassination of Jesse James by the Coward Robert Ford

In The Mood For Love

Days of Heaven

Lawrence of Arabia

The Grand Budapest Hotel

Blade Runner 2049

Interstellar

Blade Runner 2049

The French Dispatch

Barry Lyndon

The Tree of Life

Blood Simple

Days of Heaven

Mirror

Lawrence of Arabia

The Master

Taxi Driver

Dunkirk

1. Light

The Tree of Life.

Emmanuel Lubezki

The Revenant

Birdman

The Tree of Life

Children of Men

Only God Forgives

Black Narcissus

Blade Runner 2049

The Revenant

The Tree of Life

A Hidden Life

 Days of Heaven

Nestor Almendros

Days of Heaven

La Collectionneuse: long, long segment

The Revenant

Birdman

The Master

Dunkirk

Interstellar

Roger Deakins

Blade Runner 2049

The Assassination of Jesse James by the Coward Robert Ford

Birdman

The Assassination of Jesse James by the Coward Robert Ford

Days of Heaven

Only God Forgives

The Neon Demon

Blood Simple

Se7en

The Third Man

Touch of Evil

The Red Shoes

Black Narcissus

2. Composition And Framing

L'Avventura

Blade Runner 2049

Taxi Driver

3 Women

Good Morning

The Assassination of Jesse James by the Coward Robert Ford

Good Morning

The Conversation

The Master

Children of Men

In The Mood For Love

Only God Forgives

Paris, Texas

Barry Lyndon: oil paintings

Seven Samurai

Good Morning

Paris, Texas

L'Avventura

The Conversation

Persona

3. Movement

Birdman

The Tree of Life

Barry Lyndon

Taxi Driver

L'Avventura

The Graduate

4. The Power of Beauty

Interstellar

The Tree of Life

Paris, Texas

The Assassination of Jesse James by the Coward Robert Ford

In The Mood For Love

The Master

Paris, Texas

The Assassination of Jesse James by the Coward Robert Ford

Days of Heaven

In The Mood For Love

Blood Simple

Black Narcissus

Mirror

In The Mood For Love

Paris, Texas

The Conversation

Blood Simple

Black Narcissus

The Assassination of Jesse James by the Coward Robert Ford

Lawrence of Arabia

Paris, Texas

Mirror

The Tree of Life

L'Avventura

Lawrence of Arabia

Interstellar

The Tree of Life

L'Avventura

In The Mood For Love

Days of Heaven

Paris, Texas

L'Avventura

Days of Heaven

Only God Forgives

The Neon Demon

Taxi Driver

Black Narcissus

In The Mood For Love

The Assassination of Jesse James by the Coward Robert Ford

Dunkirk

Children Of Men

Se7en

The Tree of Life

Mirror

Paris, Texas

The Assassination of Jesse James by the Coward Robert Ford

The Tree of Life

The Assassination of Jesse James by the Coward Robert Ford

Paris, Texas

Only God Forgives

The Assassination of Jesse James by the Coward Robert Ford

Children of Men

3 Women

Barry Lyndon

Birdman

Blood Simple

Black Narcissus

In The Mood For Love

Monday, September 8, 2025

Broadcom -- XPUs -- On A Tear -- September 8, 2025

Broadcom is getting a lot of attention now, and the stock is having a parabolic move, as they say.

It's all due to XPUs.

"The Secret Sauce," Michael Fitzsimmons, September 8, 2025. If unable to access due to paywall, the next link is just as good, at Morningstar:

  • "Buckle up, torrid AI growth is accelerating," September 5, 2025. Link here. Morningstar raises its fair value estimate for Broadcom stock.
  • fair value estimate: $325
  • three-star rating
  • moat: wide
  • uncertainty rating: high

Broadcom’s (AVGO) July-quarter revenue rose 22% year over year to $16 billion, with $5.2 billion in artificial intelligence chip revenue rising 63% year over year. Management qualified a fourth custom AI chip customer, which it expects to drive an acceleration in AI revenue growth in fiscal 2026.

AI short- and long-term guidance significantly exceeded our expectations. Management expects up to $10 billion in incremental orders in the second half of fiscal 2026 from the new XPU customer, implying nearly a doubling of total AI revenue next year.

The $10 billion is wholly incremental to our prior forecast for 60% AI chip growth in fiscal 2026. Management indicated that the $10 billion in orders is just the beginning for its shipments to the new customer, which portends a rapid ramp-up to the level of other large customers, such as Google and Meta.

We see potential for even further upside for Broadcom’s XPU sales with three additional potential customers in talks with Broadcom. We surmise the new $10 billion customer might be OpenAI for an inference chip. We also believe Apple and Arm are part of Broadcom’s custom AI chip pipeline.

We raise our fair value estimate for wide-moat Broadcom to $325 per share, from $225, from a materially increased long-term AI chip growth forecast that includes a revenue ramp for the new XPU customer. Shares look fairly valued to us at our new valuation.

Our forecast assumes an acceleration of growth in fiscal 2026, with tapering, but still substantial growth thereafter. Management hinted at another growth acceleration in fiscal 2027 that would imply significant further upside to our updated forecast and valuation.

Shares rose a relatively modest 4% after hours, which implies to us that significant AI revenue upside was already baked in. The stock trades at all-time highs, which now appears justified to us, given the accelerating AI chip revenue opportunity.

XPUs

 An XPU is a broad, abstract term for any specialized processing unit designed to accelerate specific workloads beyond the general-purpose capabilities of a CPU. The "X" acts as a variable that can represent different architectures like GPUs, FPGAs, AI accelerators, or Data Processing Units (DPUs), depending on the task at hand. This concept of heterogeneous computing aims to optimize performance, power, and latency for diverse applications by matching the best processing unit to the right job. 


Here's a breakdown of the XPU concept:

  • Specialized Processing:
    Instead of a single, versatile CPU, an XPU represents a family of processors, each optimized for a particular type of computation.
  • "X" as a Variable:
    The "X" in XPU is like a variable in mathematics, allowing the term to refer to various accelerators like:
    • GPU (Graphics Processing Unit): For visual tasks and parallel processing.
    • FPGA (Field-Programmable Gate Array): For highly customizable hardware functions.
    • NPU/AI Accelerator: For artificial intelligence and machine learning tasks.
    • DPU/IPU (Data Processing Unit/Infrastructure Processing Unit): To offload and accelerate data flow and network tasks in data centers.
  • Heterogeneous Computing:
    XPUs are a core component of heterogeneous computing, which combines different types of processing units to work together efficiently.
  • Optimizing for Workloads:
    The goal is to match the workload (e.g., gaming, video rendering, AI training) with the most suitable XPU to achieve optimal performance, power efficiency, and speed.
  • Software Defined Acceleration:
    The success of the XPU strategy often relies on software, such as Intel's oneAPI, to allow developers to abstract the underlying hardware and select the appropriate processing unit for the task.
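
To make the heterogeneous-computing idea concrete, here is a minimal sketch of the dispatch concept: a table that routes each workload type to the processing unit usually best suited to it. The workload names and the mapping are hypothetical, for illustration only; real schedulers and software layers like the ones mentioned above are far more sophisticated.

```python
# Minimal sketch of the XPU idea: match each workload to the accelerator
# class best suited to it. Workload names and the mapping are hypothetical.

from enum import Enum, auto

class Xpu(Enum):
    CPU = auto()   # general-purpose control flow and serial work
    GPU = auto()   # massively parallel math (graphics, AI training)
    NPU = auto()   # low-power AI inference
    FPGA = auto()  # reconfigurable custom hardware functions
    DPU = auto()   # data-center network/storage offload

DISPATCH_TABLE = {
    "ui_logic": Xpu.CPU,
    "3d_render": Xpu.GPU,
    "llm_inference": Xpu.NPU,
    "custom_signal_chain": Xpu.FPGA,
    "packet_processing": Xpu.DPU,
}

def dispatch(workload: str) -> Xpu:
    """Pick a target unit for a workload; fall back to the CPU."""
    return DISPATCH_TABLE.get(workload, Xpu.CPU)

if __name__ == "__main__":
    for job in ("llm_inference", "3d_render", "spreadsheet_macro"):
        print(f"{job:>20} -> {dispatch(job).name}")
```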

Saturday, September 6, 2025

Apple Partnerships -- ChatGPT's Take -- September 6, 2025

ChatGPT prompt:

Is there a "stronger" alliance in the "tech" sector than the "partnership" between Apple and Google? Possibly Apple and Broadcom?

 

ChatGPT reply:

That’s a sharp way to frame it — because in tech, “alliances” aren’t just about who shakes hands, but about who really can’t function without the other. Let’s stack this up:

Apple - Google:

  • Scope: Primarily search and advertising. Google pays Apple an estimated $18–20 billion a year to remain the default search engine on iPhones and Safari.

  • Strength: Huge financial dependency (Google’s traffic → Apple’s devices).

  • Fragility: Antitrust regulators (US, EU) are challenging this arrangement, and Apple has been experimenting with its own search technology. It’s a marriage of convenience, not deep integration.

Apple - Broadcom:

  • Scope: Wireless components (Wi-Fi, Bluetooth, RF filters, networking chips). Apple signed a multi-billion dollar, multi-year deal with Broadcom in 2023 to supply US-made components.

  • Strength: Hardware supply chain dependency — Apple must have these chips to ship iPhones and Macs.

  • Fragility: Apple has a long history of vertical integration (it designs its own CPUs and GPUs, and eventually its own modems). Broadcom is crucial today, but Apple is slowly trying to reduce reliance.

Apple - TSMC (often overlooked):

  • Scope: TSMC manufactures Apple’s custom silicon (A-series and M-series chips).

  • Strength: Absolutely critical. Without TSMC’s advanced nodes (3nm, 2nm soon), Apple simply cannot deliver competitive iPhones, Macs, or iPads.

  • Fragility: Geopolitical risk (Taiwan). Apple has been lobbying hard for TSMC to build capacity in Arizona.

 Apple - Foxconn:

  • Scope: Assembly/manufacturing partner.

  • Strength: Foxconn builds most iPhones — the relationship is decades deep.

  • Fragility: Apple has diversified with Pegatron, Wistron, and production in India/Vietnam to avoid over-reliance.

Takeaway:

  • The Apple + Google tie is financially massive, but not existential — Apple could build or license its own search, Google could survive without Apple (though at a revenue hit).

  • The Apple + Broadcom tie is important, but transitional — Apple can and probably will internalize more chip design.

  • The Apple + TSMC relationship is arguably the strongest and most critical in tech today — deeper than Apple + Google — because it literally determines whether Apple can ship its flagship products.