Friday, 1 May 2026

J4v Graffiti Avant Guard

 

I did that.

#undergroundwars VS #topfloors

 Are these guys from the Cold Calling Call Center bizz? 

There are outbound call centers that ignore the fact that cold-calling businesses without prior contact is illegal in most EU jurisdictions. Their offers usually look like that. Below those are straightforward fraud call centers, offering desperate products and useless services, yet even to large and well-known companies.
 
There is a Hells Angels connection somewhere in the back of these activities. Every encounter I had was passive-aggressively violent. Being visibly a MOD, I trigger them in a second. I am drug-free, move like a skater and take care of my clothing as best I can, while they are the very opposite: rich, drugged, handicapped and badly dressed, blowing money whenever possible to show off.
 
Obviously, they do not come in Colors and prefer their Police buddies ... until things changed.
 
These call centers take in everyone who has no other chance and an obvious drug affinity, and must be recruitment agencies for the corrupt cop networks that told me I had a poverty oath.
 
They are connected to the Bloody Sunday Crew ...
 
Which Lord must die?

For Elly. 
 
#IRA #provos #undergroundwars #TIE
#cyberpunkcoltoure 
 

AI - Status Update

Taking the Fight Club soap for this guy is grotesque, but then most Germans consider the movie not a fundamental dystopian critique of modern society and its life goals; they take the core message to be: beat each other up to have fun.

Besides strange legal battles that might be more about ego than actual contract law, the hype is in full swing.

Small companies raise incredible amounts of funding, which reminds me of the moment I understood that everyone with A levels and severe PC-gaming time had decided to study IT and computing fields at university. At that point I decided to further cut down the number of Germans I knew, to avoid being lectured about things I actually understand much better, right after "hey, do an Ollie" instead of "Hello, how are you?"

Also, hardly anyone actually starts with their business model or technical explanations, but with how much money some people threw at them. How hardcore gaming gets you a respectable base for starting an AI company ... I don't know.

But then, I am poor, a loser in society, and have no chance to make money. By the facts, I must be wrong.

This being said, success can be faked on YouTube for quite some time. Did no one wonder where the Tates' money comes from, in the sense of where it continuously comes from, with no chat ads running alongside the cigars?

Is that not strange ...  

The point about the Tates is that most of their money must come from sponsors, but understanding this takes a conversation with a large AI.

It is impossible to provide an exact "official" payout number because YouTube considers specific creator earnings to be private contractual information. 
 
Additionally, the Tates' official channels (TateConfidential and TateSpeech) were permanently banned and demonetized in August 2022. However, we can calculate a rough estimate based on verified viewership data from their channels just before the ban and standard YouTube Partner Program (YPP) payout structures.
 
Estimated Official Payout (Before 2022 Ban)

Before their removal, the Tates' main channels had roughly 500 million to 1 billion lifetime views. Based on typical RPM (Revenue Per Mille) figures for their niche (lifestyle/finance/commentary): average RPM ~$3.00 to $7.00 (the amount a creator actually keeps after YouTube's 45% cut).
 
Total Estimated Lifetime Payout: $1.5 million – $7 million. 
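The arithmetic behind that range can be sketched in a few lines; the view counts and RPM band are the assumptions stated above, not verified figures.

```python
# Rough payout estimate: lifetime views times RPM (revenue per 1,000
# views that the creator keeps). View counts and RPM are the ranges
# assumed in the text, not audited numbers.
def payout_usd(lifetime_views: float, rpm_usd: float) -> float:
    """Return the creator's payout in USD for a given RPM."""
    return lifetime_views / 1000 * rpm_usd

low = payout_usd(500_000_000, 3.00)      # conservative end
high = payout_usd(1_000_000_000, 7.00)   # upper end
print(f"${low:,.0f} - ${high:,.0f}")     # $1,500,000 - $7,000,000
```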
 
That means the Tates could well have been a social media investment project, comparable to the current AI hype, with a very unknown financial outcome, one step below the two big guys in court as we speak. We just can't tell beyond what they present online. 1.5 million is one of their cars and 7 million an upper-end estimate, but neither figure nor their actual finances are anything they have to be held accountable for.

The current AI hype is more dangerous for potentially existing social media investors, because the companies will have contracts with customers, and some have already stated: "we collect money and then figure out how to do things."

If they sell based on that, they might get, in London or New York, the cell the Tates never will face, having committed a form of investment fraud that is new in that extent but comparable to making false sales promises and delivering nothing, as discussed; while the Tates maneuver the show-off business not that badly.

Not bad, not bad. Now you. 

#cyberpunkcoltoure 

 Open Source Vs The Others

 

#TheGermans - Mind Set

So, you have to understand that his camera makes things darker than they are, and more saturated. The lens he is using takes it almost into a Heat, Michael Mann atmosphere. The original Miami Vice TV show is, most of the time, pretty much like his villa video, and the places are comparable, but more at VHS-level PAL coding with no HD version.

What they don't say is that they are fucking night blind and all afraid of the dark....

At least there are no anti-aircraft beams like some lights around here, kitchen and living-room lamps, warm light over light-blue, ice-cold bulbs blinding every spider that managed to sneak in under false expectations.

#igotstuck 
#cyberpunkcoltoure 

Thursday, 30 April 2026

#opensource

Who can check, after Elon's legally binding statement in court, whether any of his companies uses or used open-source software without paying, even, or especially, under the Apache 2.0 model, among others.

Free is not free of charge, hence the two additional and defining words.

#cyberpunkcoltoure 

PS: If the court uses his statement, no matter the outcome, the entire corporate world can be held liable, by taking open-source software used in profitable companies to the supreme court to receive a legal framework for investigation.

Apache 2.0, by its wording, cannot be intended to be entirely free of charge, but must have been initially written by someone supporting the most liberal, core idea of open-source software. Freeware existed prior: small tools with no profit perspective. Large software systems capable of complex tailoring are a very different set, and abuse of freedom today ends in court, not in gunfight decisions as in the Wild Wild West, eventually even for the biggest.

#igotstuck #Gturff

 The Master Plan or try this instead, but make sure she was eying you before.

#MODInc
#cyberpunkcoltoure 

Jim & Joe

She skipped Cute?
...staring...
But like completely.
...staring... 
She leaves no doubt
..staring....
How is your cardio lately?
...staring....
This is terror bunny now.
...staring...
How did that happen?
Who would have thought they have a dark side?!? 
#MODInc 

Joe & Jim

 Ok. You are the first man coming here for consultancy.
So, the only right thing to do
Is to send out the housekeeping lady,
Turn off the cam
And
And
Give her a proper Dick job for the rest of the evening.
The whoooooole program.
Flip her like a Burger
And be the Bun.
...staring at them...
You need not convince her.
You need her obedient by
fulfilling her subliminal needs
fulfilling her subliminal needs
How you know?
Freud
Freud
Ok
And she sent you. 
Mmmh.
Ok. 
#MODInc 

#MODInc

Why mind Rockers?
Last position? Right in your court. CENTERED!
#cyberpunkcoltoure
#streetwars 

AI - Status Update - Cyberdeck as Architecture

Chaotic Mesh Network: a Complex Cyberdeck Approach

FM radio parallel frequency bandwidth comparison

FM radio wave data transmission is limited in bandwidth to, correct me if I am wrong, less than a 56k modem. The advantage is that it can be received by any number of receivers in broadcasting range. If the packages are also sent from a cash over several frequencies in parallel, what would be the possible bandwidth? From there, calculate the sweet spot of receivers that receive in parallel outperforming daisy-chained telephone, DSL and fibre-optic systems, please.

We need to parse the user's question carefully. It's about FM radio wave data transmission, bandwidth limitations, using multiple frequencies in parallel, and comparing to daisy-chained telephone, DSL, and fiber optic systems. The user states: "FM radio wave data transmission is limited in bandwith to, correct me if I am wrong, less than a 56kb modem. The advantage is that it can be received by any amount of receivers in broadcasting range. If also the packages are send from a cash over several frequencies in parallel what would be the possible bandwith? From there calculate the sweet spot of receivers that receive in parallel outperforming daisy chained telephone, DSL and fibre optic systems, please."

Interpretation: The user compares FM radio wave data transmission (like FM broadcasting used for data, maybe RDS or something? Historically, FM subcarrier data like Radio Data System (RDS) has low bitrate ~1.2 kbps. But "less than a 56kb modem" probably means that typical FM radio data transmission (like using audio bandwidth of FM radio for data) is limited to roughly the bandwidth of a telephone line, like a dial-up modem max 56 kbps. Wait, FM radio channel bandwidth: In broadcast FM, each station occupies 200 kHz (in the 88-108 MHz band). The audio baseband is up to 15 kHz for mono, with stereo using 53 kHz subcarrier, etc. If you use the entire 200 kHz channel for data transmission, the theoretical maximum data rate depends on modulation and signal-to-noise ratio. Shannon capacity: C = B * log2(1 + S/N). FM broadcast typically has high S/N, maybe 50 dB? So B=200 kHz, S/N=10^5, log2(1+100000) ≈ log2(100001) ≈ 16.6 bits/s/Hz, capacity ~ 200,000 * 16.6 ≈ 3.32 Mbps. So FM radio channel can support several Mbps. But the user says "less than a 56kb modem". Maybe they're thinking of legacy systems like pager networks (FLEX protocol on FM subcarrier? Actually FLEX uses FM subcarrier typically at 1200-6400 bps). Or maybe "FM radio wave data transmission" like packet radio using FM modulation on VHF amateur bands, where typical 1200 baud AFSK over FM yields 1200 bps, but there are higher speeds. The 56k modem reference suggests that they think it's limited to telephone line bandwidth (3 kHz). Possibly they're equating FM radio to audio bandwidth data over FM, like Bell 103 modems using FM modulation (AFSK) over radio, which could have similar baud rates. But the question: If packages are sent from a "cash" (maybe "base" or "source"?) over several frequencies in parallel, what would be possible bandwidth? And then calculate the sweet spot of receivers that receive in parallel outperforming daisy chained telephone, DSL and fibre optic systems. 
Hmm, "cash" might be a typo for "source" or "base"? Or maybe "cash" as in a cash register? No, "send from a cash" - could be "send from a core" or "from a central location"? German original? The user wrote "If also the packages are send from a cash over several frequencies in parallel" - "cash" could be a misspelling of "cache" or "base". Perhaps "from a base" or "from a head end". I think they mean the data packets are sent from a central transmitter over multiple FM frequencies simultaneously, and receivers can tune to multiple frequencies in parallel to increase bandwidth. Then they ask to calculate the "sweet spot" (optimal number of parallel receivers? or number of frequencies/receivers?) that outperforms daisy-chained telephone, DSL, and fiber optic systems. "Daisy chained telephone" might mean a chain of telephone modems? Wait, "daisy chained telephone" could refer to a scenario where multiple telephone lines are bonded together in a daisy chain to aggregate bandwidth. Or maybe they're comparing the broadcast architecture (one transmitter, many receivers, could aggregate parallel frequencies on each receiver) to point-to-point daisy-chained systems. Need to guess.

Read: "FM radio wave data transmission is limited in bandwith to, correct me if I am wrong, less than a 56kb modem. The advantage is that it can be received by any amount of receivers in broadcasting range. If also the packages are send from a cash over several frequencies in parallel what would be the possible bandwith? From there calculate the sweet spot of receivers that receive in parallel outperforming daisy chained telephone, DSL and fibre optic systems, please."

Interpretation: FM radio data transmission's bandwidth is limited (maybe 56kbps). If you use multiple frequencies in parallel from a central source ("cash" might be "base"?), what's the total possible bandwidth? Then, since the advantage is that unlimited receivers can receive the same broadcast, you could have many receivers, each receiving a subset of frequencies, and then combine data across receivers? Wait, "sweet spot of receivers that receive in parallel outperforming daisy chained telephone, DSL and fibre optic systems" - The user might be asking: In a network where many receivers are tuned to different frequencies to collectively achieve high aggregate throughput (like a distributed reception system), what is the number of receivers such that the total throughput (summed across all receivers) exceeds the throughput of a daisy-chained (i.e., relayed) system using telephone, DSL, fiber, etc.? That seems odd. Alternatively, maybe they refer to a "daisy chain" of FM transmitters? Not clear. "Daisy chained telephone, DSL and fibre optic systems" could mean concatenated links, where data passes through multiple telephone/DSL/fiber segments sequentially, each adding latency and limited by the slowest link. Or maybe they're comparing the broadcast capability: In a daisy-chained system, you need point-to-point links between nodes. In broadcast, one transmitter can reach many receivers without additional infrastructure. They talk about "receivers that receive in parallel outperforming daisy chained ... systems". Could be: If you have N receivers each receiving at 56 kbps on a separate FM frequency, the total data received per unit time across all receivers is N*56 kbps, but that's distributed among receivers. To outperform a daisy-chained telephone (which might be 56 kbps point-to-point per hop), you need to compare the capacity to deliver data to multiple endpoints. 
Maybe the idea: In a daisy chain, each link shares the same medium, so total network throughput is limited by single link speed. In a broadcast parallel frequency system, many receivers can simultaneously receive different data streams on different frequencies, thus aggregate throughput scales with number of frequencies (or number of receivers). Find the "sweet spot" (minimum number of receivers) needed to exceed the capacity of a comparable daisy-chained system using telephone (56k), DSL (say 10 Mbps?), and fiber (1 Gbps?) links.

But the question: "calculate the sweet spot of receivers that receive in parallel outperforming daisy chained telephone, DSL and fibre optic systems". So we need to assume a certain bandwidth for FM radio data transmission per frequency. The user says "less than a 56kb modem", maybe they think FM radio wave data transmission bandwidth is like 33.6 kbps or 56 kbps. I'll confirm: FM radio can be used for data transmission using Subcarrier (like RDS ~1.2 kbps, DARC ~16 kbps, or high-speed data on FM subcarrier like FMeXtra uses 32-64 kbps digital channels). But typical "wireless modem" using FM radios in the 90s (like AX.25 packet radio at 1200/9600 baud). The 56k modem reference is V.90 downstream 56 kbps, upstream 33.6 kbps. So likely they consider FM data transmission to have similar bandwidth as a phone line, ~56 kbps. I'll assume per FM frequency, the maximum data rate is 56 kbps (like using entire FM channel for data with simple modulation? Actually, a 200 kHz channel can do more, but maybe they're thinking of sub-audible data? I'll adopt 56 kbps as the baseline per frequency).

Now, "packages are send from a cash over several frequencies in parallel". A "cash" could be "base station". So a base station transmits data packets simultaneously on multiple FM frequencies. Each receiver can only receive one frequency at a time, or maybe multiple if it has multiple tuners. The question: "what would be the possible bandwith?" If you send the same data on all frequencies (broadcasting), it's still 56 kbps, no gain. If you send different data on each frequency (multiplexing), the total bandwidth from the base station is N * 56 kbps. But receivers would need to be able to select which frequency they listen to. The advantage of broadcast is unlimited receivers but each only gets the data sent on that frequency. So to get the full N56 kbps at a single receiver, the receiver would need to receive all N frequencies in parallel (i.e., N tuners). That's tricky. The user mentions "receivers that receive in parallel". So a receiver could be equipped with multiple parallel receivers, combining data from several frequencies. Then the bandwidth can scale. So if a receiver has M parallel receivers, it can get M56 kbps. So the possible bandwidth per receiver is M*56 kbps. The question: "From there calculate the sweet spot of receivers that receive in parallel outperforming daisy chained telephone, DSL and fibre optic systems". Possibly the "sweet spot" is the number of parallel frequencies (or receivers within a device) needed to surpass the bandwidth of daisy-chained alternatives. "Daisy chained telephone" might mean a phone line network where multiple modems are chained, but maybe they mean a single telephone line (56 kbps). DSL varies (e.g., ADSL2+ up to 24 Mbps down). Fiber optic can be 1 Gbps or more. So they want to find a number M such that M*56 kbps > bandwidth of telephone (56 kbps), DSL, and fiber. But that's trivial: M=1 already equals telephone, M=2 > telephone. For DSL, assume maybe 10 Mbps. 10 Mbps / 56 kbps ≈ 179. 
For fiber 1 Gbps / 56 kbps ≈ 17,857. But that's too simple. The phrasing "sweet spot of receivers that receive in parallel" suggests perhaps there is a cost/complexity trade-off, or it refers to a network scenario where multiple receivers collaboratively receive a broadcast and then share data, outperforming a daisy-chained telephone/DSL/fiber network in terms of total throughput. Could be a comparison of topologies: a broadcast system with many receivers, each adding to the aggregate data reception capacity (like a cooperative receive diversity). A daisy chain system requires each hop to relay data, so throughput is bottlenecked by the slowest link. In a broadcast system, multiple receivers can simultaneously capture the same high-bandwidth signal? Wait, if you have a broadband FM transmission (say high data rate using a wideband signal), all receivers can receive it, so unlimited receivers each get that high bandwidth without any loss. That's the advantage. Then they ask: if the packages are sent from a base over several frequencies in parallel (frequency division multiplexing), what is possible bandwidth? That's just sum of bandwidths. Then they want to compare to daisy-chained telephone/DSL/fiber: In a daisy chain, you have multiple point-to-point links connected in series, like A->B->C->D, each using a telephone modem (56k). The end-to-end throughput for a single flow might still be limited by the single link (56k), but you could have multiple flows. In a broadcast system, a single transmitter can send to all receivers simultaneously, but if you want to send unique data to each receiver, you need to multiplex. The "sweet spot" might be the number of receivers where broadcasting on separate frequencies (parallel) becomes more efficient in total throughput than a daisy-chain network of telephone lines. But that's vague.

Let's analyze the sentence: "From there calculate the sweet spot of receivers that receive in parallel outperforming daisy chained telephone, DSL and fibre optic systems, please." Maybe it's: Determine the number of parallel FM frequencies (or parallel receivers) M such that the total capacity (M * 56 kbps) surpasses the effective throughput of a daisy-chained telephone link (multiple telephone links in series? that doesn't increase bandwidth, it's still 56 kbps end-to-end unless you bond them, but "daisy chained" might mean multiple telephone lines in a chain, each segment is separate, like a multi-hop network where each hop uses a telephone modem, total network throughput might be scaled by the number of concurrent transmissions if using spatial reuse? Not clear). Alternatively, "daisy chained telephone, DSL and fibre optic systems" could refer to using these as backhaul links in a daisy-chain topology (like a chain of repeaters). It might be about comparing infrastructure: To cover a large area with many users, a daisy chain of telephone modems would require many point-to-point lines, each with limited bandwidth. A single broadcast FM transmitter can cover all users with one wideband channel that could be subdivided into many parallel carriers. So the "sweet spot" might be the number of users where broadcast is more spectrum efficient. That's typical broadcast vs. unicast comparison. The user says "sweet spot of receivers that receive in parallel outperforming..." Maybe "receivers that receive in parallel" refers to using multiple FM receivers at the same location to combine bandwidth, making a high-speed link. Then compare: If you use a daisy chain of telephone modems, you might bond multiple phone lines to get higher speed (like multilink PPP). That requires multiple phone lines and modems. The FM parallel approach uses multiple radio frequencies and multiple receivers. Which one is more cost-effective? 
The sweet spot might be the number of parallel channels where FM becomes advantageous due to unlimited broadcast advantage? But why would FM parallel outperform a daisy chain of telephone lines? Because FM broadcast allows many receivers to share the same transmitter infrastructure, whereas daisy chaining telephone lines requires dedicated lines per receiver. But the question specifically says "receivers that receive in parallel outperforming daisy chained telephone, DSL and fibre optic systems". I think the user is asking: If you have a system where you send data over FM using parallel frequencies (say N frequencies) from a base, and a receiver uses M parallel receivers (M <= N) to achieve a certain bandwidth, compare with a daisy-chained point-to-point system where each hop uses a telephone modem (56k), DSL, or fiber. For a daisy-chained telephone, end-to-end throughput is 56 kbps per hop, but if you have multiple hops, latency increases, throughput might not decrease if full-duplex, but the network capacity is still 56 kbps per link. If you want to serve K receivers with unique data, a broadcast system can transmit different data on different frequencies, and receivers selectively listen. The total transmitter bandwidth is N*56k. The total data delivered to K receivers is sum over receivers of their reception rates. If each receiver uses M parallel receivers, the total data rate delivered is K * M * 56k, but must not exceed transmitter capacity N*56k. So we need K*M <= N. For a daisy-chained network with telephone lines, suppose you have a chain of K nodes (1 source, K receivers) with point-to-point links of 56k. The total network capacity (sum of rates delivered to all receivers) for a broadcast flow could be 56k if source sends same data to all, but if you want unique data, you might need to time-share. It's complex.

Maybe it's simpler: They think FM radio data rate is 56 kbps (like dial-up). If you send data simultaneously on multiple FM frequencies (like using different stations), you can aggregate bandwidth. A receiver could combine multiple frequencies to get higher throughput. They ask: what would be the possible bandwidth? That's just N*56kbps (where N is number of frequencies). Then "sweet spot of receivers that receive in parallel" might be the number of parallel frequencies (receivers) needed to exceed the data rates of daisy-chained telephone (single phone line? "daisy chained telephone" maybe means multiple phone lines bonded together? A "daisy chain" of telephone modems? Perhaps "daisy chained" is a mis-translation of "daisy-chained" meaning "concatenated" or "combined". In networking, you can "daisy chain" switches. But they might refer to the capacity of a single telephone line (56k), a single DSL connection (e.g., 10 Mbps), and a single fiber connection (1 Gbps). So calculate M such that M * B_FM > B_telephone, B_DSL, B_fiber. But B_FM is less than 56k modem? The user says "less than a 56kb modem". So B_FM < 56 kbps. Actually they said "is limited in bandwith to, correct me if I am wrong, less than a 56kb modem." So maybe FM data rate is, say, 32 kbps. They want me to correct if wrong. I should correct: FM radio can support much higher data rates (e.g., FM band 200 kHz channel can carry Mbps if using digital modulation). But historically, subcarrier data rates were low. So I'll note that. Then I can compute the sweet spot assuming whatever baseline.

But the phrase "sweet spot" could be interpreted as the optimal number of receivers that receive in parallel such that the total system cost/complexity vs. performance is better than alternatives. They ask for calculation: "From there calculate the sweet spot of receivers that receive in parallel outperforming daisy chained telephone, DSL and fibre optic systems". So I need to output a number: for telephone, M=1 is enough (since FM < 56k, so you need M slightly > 1? If FM is 56k, M=1 equals telephone. If FM is less, you need more). For DSL: assume typical DSL downstream 24 Mbps, M = ceil(24 Mbps / FM bitrate). For fiber: 1 Gbps, M = ceil(1 Gbps / FM bitrate). But "daisy chained" might mean multiple segments: a daisy-chained telephone network where data goes through multiple 56k hops, total throughput still 56k. So M=1 equals or exceeds single telephone hop. The "sweet spot" could be the break-even point where parallel FM receivers give higher aggregate capacity than a series of telephone/DSL/fiber hops (which might still be limited to single link speed). That break-even would simply be M=1 (if FM = 56k) or M=2 if FM=33.6k. That's trivial. Maybe they want to compare to a chain where bandwidth multiplies? Like a "daisy chained" system where you can connect multiple telephone lines in parallel to increase bandwidth? Daisy chaining usually means serial connection, not parallel. But people sometimes "daisy chain" DSL modems? Not typical. I suspect the user might be thinking of a "token ring" or something where bandwidth is shared. Or they mean using multiple telephone lines bonded (like inverse multiplexing) to achieve higher speeds. The phrase "daisy chained telephone" might be a mistranslation from German: "daisy chain" in networking could be "Reihenschaltung" (series connection). Could be a chain of repeaters. In German, "daisy chain" might be used for "Kaskadierung" (cascading). So they compare parallel frequency reception vs. cascading telephone modems. 
For instance, you can cascade telephone modems (each modem receives and retransmits) to cover distance. Each hop adds latency but bandwidth might stay 56k. FM broadcast covers distance without cascading. That's advantage. But the question is about bandwidth.

Let's re-read: "FM radio wave data transmission is limited in bandwith to, correct me if I am wrong, less than a 56kb modem. The advantage is that it can be received by any amount of receivers in broadcasting range. If also the packages are send from a cash over several frequencies in parallel what would be the possible bandwith? From there calculate the sweet spot of receivers that receive in parallel outperforming daisy chained telephone, DSL and fibre optic systems, please." This might be a theoretical exercise: FM broadcast offers unlimited receivers but low bandwidth. If you use multiple frequencies in parallel (like multiple FM stations), you can increase total bandwidth. Now, suppose you want to build a network that delivers data to many users. With daisy-chained telephone lines, you'd have to lay a physical line to each user, each line can carry up to some speed. With FM, you can have one transmitter to many. The "sweet spot" could be the number of parallel frequencies (and corresponding receivers) needed so that the total broadcast capacity (aggregated over all users) exceeds the total potential capacity of a daisy-chained wired network with similar infrastructure investment. Or maybe they want to compare a single high-speed link (fiber) to many parallel FM channels. That is, fiber can do 1 Gbps, FM can do 56 kbps per channel, so you need 17857 parallel FM channels. But then they mention "daisy chained telephone, DSL and fibre optic". So have three tiers: telephone (56k), DSL (say 10-100 Mbps), fiber (1-10 Gbps). Compute M for each.

I'll interpret the user's request as: Given an FM radio data transmission channel with bitrate R (possibly less than 56 kbps), if a central "base" (cash) sends packets over multiple frequencies in parallel, the total possible bandwidth is N * R (where N is the number of frequencies). If a receiver employs M parallel receivers (tuning to M frequencies simultaneously), it achieves M * R bandwidth. Find the minimum M (the sweet spot) such that M * R outperforms (i.e., exceeds) the bandwidth of a single daisy-chained telephone line (R_tel = 56 kbps), a DSL line (R_DSL, we need to assume typical value), and a fiber optic line (R_fiber, typical value). But what is "daisy chained"? They might repeat that daisy chaining doesn't increase bandwidth; it just extends distance. So the relevant comparison is just the link speed of a telephone modem (56k), DSL (like 10 Mbps), fiber (1 Gbps). So the sweet spot is M such that M * R > target speed. Then compute M for each. Since FM R is "less than a 56kb modem", we might assume R = 56 kbps? Actually, if FM is less, maybe 33.6 kbps. I'll correct: FM radio wave data transmission can achieve much higher than 56 kbps if using broadband FM channel. However, conventional narrowband FM data (like Bell 202 AFSK) is 1200 bps. But the user's baseline is "less than a 56kb modem", maybe thinking of something like a dial-up wireless modem. I'll provide both clarification and then calculate the sweet spot assuming a range of possible FM data rates, like 1.2 kbps (packet radio), 56 kbps (hypothetical FM data modem). I'll note that typical broadcast FM subcarrier can carry 16 kbps (DARC) or up to 64 kbps with FMextra. I'll choose a plausible value: FM radio data (subcarrier) can be up to, say, 32 kbps. But let's search my memory: RDS is 1.1875 kbps. The FM subcarrier at 57 kHz can carry up to about 16 kbps using MSK (DARC). FMextra can use multiple subcarriers to get up to 128 kbps. 
But common knowledge: FM broadcast audio can be used to transmit data at up to 56 kbps using a modem over the audio? If you feed a V.90 modem signal into an FM transmitter, the demodulated audio at receiver could yield up to 56 kbps downstream, but V.90 requires digital PCM connection, not analog. So maybe not. I'll set R_FM = 56 kbps as a generous estimate for a single FM channel used entirely for data (with appropriate modulation). Actually, with a 200 kHz FM channel, you can use QPSK with 200 ksym/sec, yielding 400 kbps. But let's not overcomplicate. I'll answer: A standard FM radio channel (200 kHz) can theoretically support up to ~300 kbps using efficient modulation, but using a subcarrier like V.90 is not typical. Given the user's perception, I'll note that the 56k modem comparison likely refers to audio-band modems, where the baseband signal fits in a 3 kHz telephone channel. If you transmit that over an FM transmitter with 15 kHz audio bandwidth, you can achieve up to 33.6 kbps (V.34) reliably, maybe 56 kbps with digital techniques. So I'll assume 33.6 kbps as the FM data rate for this exercise. The user says "less than a 56kb modem", so maybe they think it's around 33.6 kbps. I'll use 33.6 kbps as a representative value. Then, if you send data over multiple FM frequencies in parallel (like multiple transmitters), the aggregate bandwidth from the base (cash) is N33.6 kbps. A receiver equipped with M parallel tuners can achieve M33.6 kbps. Sweet spot: M = ceil(target_rate / 33.6 kbps). For telephone (56 kbps): M=2 (since 2*33.6=67.2 > 56). For DSL: typical ADSL2+ 24 Mbps down: M = ceil(24000/33.6) = ceil(714.3) = 715. For VDSL2 100 Mbps: M=2977. For fiber: 1 Gbps: M=29762; 10 Gbps: M=297620. But that seems like a straightforward arithmetic, not a "sweet spot" implying optimization. 
The phrase "sweet spot of receivers that receive in parallel" might mean the optimal number of receivers (not receiver channels) in a collaborative network, where many receivers each tune to a different frequency and then share data over a local network, achieving a total aggregate throughput that exceeds a daisy-chained system of telephone lines serving the same number of users. That would be a scaling analysis. Let's consider a scenario: We want to distribute high-speed data to K users. Option 1: Daisy-chained telephone modems: Each user connects via a dedicated phone line (56 kbps) in a daisy chain (so total bandwidth for the network is K*56 kbps if each line simultaneously carries data, but a daisy chain topology might mean they all share the same bus? A daisy chain is a series of devices connected in sequence, like RS-485, where only one can talk at a time, so total shared bandwidth is 56 kbps. If it's a point-to-point chain with store-and-forward, throughput per flow is still 56 kbps, but multiple flows can exist if they don't overlap. With full-duplex links, total network capacity can be up to sum of link capacities if smart routing. It's complex. Instead, maybe they refer to a "daisy-chained" wired network where you have a single long link composed of multiple segments (e.g., repeaters). The throughput is limited by the segment capacity. For telephone, segment capacity 56 kbps. For DSL, say 10 Mbps. Fiber, 1 Gbps. So a single channel gives that capacity. With FM broadcast and parallel frequencies, you can bond multiple 33.6 kbps channels to exceed that segment capacity. The "sweet spot" is the number of bonded FM channels needed to outperform the wired system in terms of bandwidth. That's just M=2 for telephone, M=715 for DSL, M=29762 for fiber. I'll present that. 
However, "daisy chained telephone" might refer to a scenario where multiple telephone lines are used in parallel (multilink) to create a higher-speed link, and they want to know how many FM parallel channels to beat that multilink aggregate. For example, a daisy chain of telephone modems could bond multiple 56k lines. If you bond L telephone lines, bandwidth is L × 56k. To outperform, you need M × 33.6k > L × 56k. Without L, it's indeterminate. So maybe they assume a single telephone line, DSL line, fiber line. I'll just answer that.

I'll craft a response: First, correct the assumption: FM radio wave data transmission's bandwidth is not inherently limited to <56 kbps. Depending on the deviation and channel width, you can achieve much higher. But using typical FM radio with audio-band modems (like dial-up over FM) can be limited. I'll provide context. Then, assuming a conventional approach where the FM radio link provides a data rate comparable to a dial-up modem (say 33.6 kbps per radio channel), transmitting over N frequencies in parallel from a base station gives aggregate bandwidth of N × 33.6 kbps. A receiver with M parallel tuners can get M × 33.6 kbps. Then compute the required M to exceed the bandwidth of a single telephone modem (56 kbps), a typical DSL connection (e.g., 25 Mbps down), and a fiber connection (e.g., 1 Gbps down). That gives M = 2, 745, 29,762 respectively (adjust numbers). I'll also discuss that "daisy chained" might refer to multiple links in series, which doesn't increase throughput but adds latency; the parallel FM approach can match or exceed throughput with enough parallel channels. Finally, I can compute a "sweet spot" where the number of parallel receivers is manageable (e.g., under 1000) which is only practical for DSL-like speeds, not fiber.

I'll structure the answer: 1. Correct the myth about FM bandwidth. 2. State assumption for calculation. 3. Explain parallel frequency aggregation. 4. Calculate break-even M for each reference technology. 5. Discuss feasibility and sweet spot interpretation.


I think there’s a fascinating misconception at the heart of your question, so let’s correct that first, then play out the numbers for the parallel-receiver scenario you’re asking about.

1. FM radio wave data transmission is not inherently capped below a 56K modem
A voice‑grade telephone line has about 3 kHz of bandwidth, which limits a V.90 modem to 56 kbit/s downstream (and 33.6 kbit/s upstream).
A wideband FM broadcast channel occupies 200 kHz (in the 88–108 MHz band). If you use that whole channel purely for data with modern modulation, Shannon’s theorem gives you a theoretical capacity on the order of:
C ≈ 200 000 × log₂(1 + SNR)   (SNR ≈ 40–50 dB) ⇒ several Mbit/s

In practice, commercial FM subcarrier data systems (like DARC) have offered 16 kbit/s, and later systems (FMeXtra) bonded multiple subcarriers to reach ~128 kbit/s. So a single FM station can already outrun a dial‑up modem.

For your question, however, I’ll work with the spirit of your premise: you’re imagining a narrowband FM data link that behaves like an old‑school telephone modem. I’ll assume a conservative 33.6 kbit/s per FM channel (the upstream limit of V.90, and a realistic number for a single audio subcarrier).
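As a sanity check on the Shannon figure above, here is a minimal Python sketch; the 200 kHz bandwidth and 40–50 dB SNR are the assumptions stated in the text, and the helper name is mine:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley limit C = B * log2(1 + SNR), with SNR given in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A full 200 kHz FM broadcast channel at the SNRs assumed above:
for snr_db in (40, 50):
    c = shannon_capacity_bps(200_000, snr_db)
    print(f"SNR {snr_db} dB -> {c / 1e6:.2f} Mbit/s")  # both land in the "several Mbit/s" range
```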

2. Transmitting over several frequencies in parallel
If a central “base” (I think “cash” is a typo) sends different data packets simultaneously on N separate FM frequencies, the total radiated data bandwidth is:
B_total = N × 33.6 kbit/s

A single receiver that listens to only one frequency gets 33.6 kbit/s. To exploit the full aggregate, a receiver must employ M parallel tuner/demodulator chains, each locked to a different frequency. That receiver then enjoys:
B_receiver = M × 33.6 kbit/s   (with M ≤ N)

3. “Sweet spot” where parallel FM outperforms daisy‑chained wired links
I interpret your “daisy‑chained telephone, DSL and fibre optic systems” as a single end‑to‑end serial link of each technology. Daisy‑chaining (concatenating) multiple identical links in series does not increase the data rate—it only extends the reach. So the throughput of a daisy‑chained telephone line remains 56 kbit/s, a DSL line remains whatever the subscriber speed is (e.g., 25 Mbit/s for ADSL2+, 100 Mbit/s for VDSL2), and a fibre link remains 1 Gbit/s (typical GPON) or 10 Gbit/s (enterprise).

The “sweet spot” you’re asking for is simply the smallest number of parallel FM receivers M that yields a data rate greater than the wired target.
Technology                Reference bandwidth    M = ceil(BW / 33.6 kbit/s)    Notes
Telephone modem (V.90)    56 kbit/s              2                             2 × 33.6 = 67.2 kbit/s; beats the wire with just 2 frequencies
ADSL2+                    25 Mbit/s              745                           745 × 33.6 ≈ 25.0 Mbit/s
VDSL2                     100 Mbit/s             2 977                         2 977 × 33.6 ≈ 100.0 Mbit/s
Fibre (1 Gbit/s)          1 000 Mbit/s           29 762                        29 762 × 33.6 ≈ 1 000 Mbit/s
Fibre (10 Gbit/s)         10 000 Mbit/s          297 620                       297 620 × 33.6 ≈ 10 Gbit/s

If your basic FM channel were instead 56 kbit/s (like the peak downstream of a modem), the numbers scale linearly: for 1 Gbit/s fibre you’d need about 17 858 parallel receivers. If you use a realistic wideband FM data channel of, say, 200 kbit/s, only ~5 000 receivers would match 1 Gbit/s.
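The break-even values in the table can be reproduced with a few lines of Python; the 33.6 kbit/s per-channel figure is the working assumption from above, and the reference speeds are the same illustrative ones:

```python
FM_CHANNEL_BPS = 33_600  # assumed data rate of one narrowband FM channel

def receivers_needed(target_bps: int, channel_bps: int = FM_CHANNEL_BPS) -> int:
    """Smallest M such that M * channel_bps strictly exceeds target_bps."""
    return target_bps // channel_bps + 1

targets = {
    "Telephone modem (V.90)": 56_000,
    "ADSL2+": 25_000_000,
    "VDSL2": 100_000_000,
    "Fibre (1 Gbit/s)": 1_000_000_000,
    "Fibre (10 Gbit/s)": 10_000_000_000,
}
for name, bps in targets.items():
    print(f"{name}: M = {receivers_needed(bps):,}")
```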

4. So where is the real “sweet spot”?
The term “sweet spot” usually implies a region where the solution is both technically feasible and economically attractive. Building a consumer device with hundreds or thousands of parallel radio receivers is massively complex (RF front‑end, antennas, interference) and power‑hungry. Therefore the practical sweet spot lies where M is small, ideally under 10. That means:

    For telephony replacement: 2–4 receivers are trivial; FM parallel radio is already realistic.
    For low‑end DSL speeds (a few Mbit/s): tens of receivers might still be engineered (software‑defined radio could handle it).
    For fast DSL (25–100 Mbit/s): hundreds to thousands of receivers is a laboratory curiosity, not a product.
    For fibre: tens of thousands of parallel receivers is firmly in the “theoretical” category.

The wireless parallel‑FM approach would only “outperform” a daisy‑chained wired system in a setting where you can’t lay wires at all, and you only need aggregate bandwidth on the order of a few Mbit/s shared among many listeners—exactly the broadcast advantage you noted. Beyond that, fibre and even DSL leave it in the dust.

 
Ok. So, this is not a product. It is about data delivery theory. Mesh network research is a comparable and actually related field. I am not sure if the term changed, but I mean a system in which several places are connected with each other in a rather chaotic tangle of hubs and routers, the opposite of a clearly structured office LAN, and only a set of internet connections exists, shared by the entire system.
This causes a massive need for routing intelligence, which is the core field of research.
My theoretical approach would be an automated mapping crawler that scans the Chaotic Mesh Network to tame its complexity by having fixed lines of IP addresses for certain routes, such as those to the internet connections, or chains of the fastest point-to-point connections that communicate a lot with each other. Basically, a system under constant overwatch.
In such a system an FM broadcast could in parallel push cached data into the entire system, like an Ubuntu Security Update.


We need to parse the user's follow-up. They're clarifying that their previous "FM radio wave data transmission" question wasn't about a product but about data delivery theory, specifically in a "chaotic mesh network" scenario. They're describing a mesh network with a few internet connections shared by many nodes, and a need for routing intelligence. They propose an "automated mapping crawler" that scans the network to identify fixed routes (like to internet gateways) and optimize them. Then they ask: "In such a system a FM broadcast could in parallel push cached data into the entire system, like a Ubuntu Security Update."

The user is asking for an evaluation of this idea: using FM broadcast (one-to-many) to push data (like updates) into all nodes of a chaotic mesh network simultaneously. I need to connect this to the previous bandwidth discussion. In the previous message, I corrected that FM data bandwidth is not fixed to 56kbps, but I used 33.6kbps as a conservative per-channel figure, and calculated parallel receivers needed to match wired speeds. Now they're asking about using a single FM broadcast (or multiple frequencies) to push cached data into the entire system. The advantage: broadcast can reach all nodes in range simultaneously, bypassing the mesh's multi-hop routing and congestion. The question is: How effective is this for pushing data like Ubuntu security updates? Need to discuss data delivery theory, bandwidth needed, latency, reliability, and how it complements the mesh network.

The user mentions "cached data" like an Ubuntu Security Update. So we need to talk about distribution of bulk data (updates) that many nodes need. Compare with traditional mesh distribution: flooding, gossip protocols, etc. FM broadcast could be a "data diode" or "one-to-many" high-latency but high-simultaneous delivery channel. Could be beneficial if bandwidth needs are modest, but Ubuntu updates can be large (hundreds of MB). If we use 33.6 kbps per channel, that's ~250 MB in about 17.4 hours (calculation: 33.6 kbps = 4.2 kB/s, 250 MB = 250,000 kB / 4.2 = 59,523 sec ~ 16.5 hours). But we could use a wideband FM subcarrier with, say, 128 kbit/s, which would take ~4.3 hours. Still slow. But they said "push cached data into the entire system" – maybe they push incremental delta updates or critical small patches. Or they could use multiple parallel FM frequencies to increase bandwidth, as we discussed earlier. But in the mesh network scenario, each node needs only one receiver (listening to all push frequencies? No, they'd need parallel tuners to aggregate). But they said "a FM broadcast could in parallel push cached data into the entire system" – "in parallel" might mean multiple frequencies simultaneously, or just the broadcast nature. I'll address both.

They'll want to know the potential bandwidth scaling, the sweet spot relative to mesh routing overhead, and practical considerations. I'll reframe: In a chaotic mesh, multi-hop delivery of a large update to all nodes consumes network resources (bandwidth, routing overhead). A broadcast push from a central transmitter (like a high-site FM station) could offload that bulk distribution. But speed depends on data size, channel bandwidth, and whether multiple channels are used. I'll mention that a single FM broadcast with DAB or HD Radio could deliver hundreds of kbit/s, but still insufficient for gigabyte updates quickly. However, the idea of using broadcast for software updates in mesh networks exists (DataCast, satellite multicast, etc.). FM broadcast specifically has limited bandwidth, so it might be best for small, critical metadata or security patches, not full distro upgrades.

The user also mentioned "automated mapping crawler" to find best routes to internet gateways and create fixed IP chains, which is a kind of overlay routing. I need to acknowledge that part and connect it to the broadcast: such an overlay could be used to direct where the broadcast data is injected, maybe from a gateway that has the update, but FM broadcast is external and doesn't need mesh routing; it's a separate channel. It could be used to pre-seed caches.

So my answer structure:

    Clarify the shift from product to theory, situate the idea in mesh research.

    Discuss the chaotic mesh network problem: shared internet connections, routing overhead, and the need for efficient bulk data distribution.

    Evaluate the automated mapping crawler concept (briefly) – it's somewhat like SDN in mesh or proactive route optimization.

    Focus on the FM broadcast idea:

        How it works: broadcast infrastructure (possibly using existing FM towers) transmits data; nodes have FM receivers and decoder chips (like RDS/TMC or proprietary). Benefit: all nodes in range receive data simultaneously, no network load, no routing.

        Bandwidth considerations: realistic FM subcarrier data rates (RDS: 1.1875 kbps; DARC: 16 kbps; HD Radio/DRM+: up to 100-300 kbps). Using parallel frequencies multiplies bandwidth but requires multiple receivers per node (cost/complexity). Calculate download times for a typical Ubuntu security update (say 50 MB for a collection of security patches). At 16 kbps, 50 MB would take ~7 hours; at 128 kbps, ~53 minutes. At 56 kbps (modem-like), ~2 hours. So it could work for overnight updates.

        Possible parallel frequency aggregation: If node has M receivers, aggregate rate = M * R_channel. But receiver cost, synchronizing data, error correction.

        Convergence with mesh: The broadcast can deliver a "carousel" of data (repeated transmission) to ensure all nodes eventually receive it, even if they miss parts. This is like FLUTE/ALC. Then nodes might share missing parts via mesh peer-to-peer. So hybrid: broadcast bulk + gossip repair.

        Comparing to mesh-only distribution: In a multi-hop mesh, each hop doubles the airtime consumption (interference). So broadcast is much more spectrum-efficient for one-to-many. The "sweet spot" in terms of network savings is immediate for any broadcast data size where the aggregate traffic avoided exceeds the broadcast transmission time. With a broadcast, time = size/rate; mesh would consume size * (average hop count) capacity. So broadcast wins as long as rate > 0. I can illustrate.

        Practical limitations: FM band is licensed; spectrum re-use for data is regulated. Could use unlicensed bands (e.g., ISM 915 MHz) with similar concept. But the user's context is FM radio wave, so I'll stick with that.

    Relate back to the earlier bandwidth scaling: Not trying to beat fiber, just efficiently push updates to a large set of mesh nodes. FM broadcast's one-to-many nature makes it superior for this specific use case, despite low rate. So the "sweet spot" is about number of nodes and update size where it's beneficial: given size S, rate R, number of nodes N, average hop length H, mesh capacity C_mesh (aggregate), the broadcast time T_bc = S/R, mesh time T_mesh = (S * H * N) / C_mesh_available (assuming sequential unicast floods). If T_bc < T_mesh and spectrum is available, it's beneficial. For large N, broadcast wins easily.

    Conclude: This is a valid theoretical data delivery paradigm, akin to "broadcast disk" in wireless networks. The automated mapping crawler could dynamically decide which content to broadcast based on popularity. So it's a clever integration.

I'll write a comprehensive but accessible response.

Thank you for clarifying — that shifts the conversation from a consumer product to a completely valid area of data delivery theory, and I can now engage with your premise properly. Let me restate what I think you’re proposing, then explore it step by step.
 
Your scenario in a nutshell
    You have a chaotic mesh network: a tangle of routers/hubs with no fixed topology, dynamic paths, and only a few shared internet gateways.

    You add an automated mapping crawler that continuously probes the mesh, discovers stable or high‑traffic routes, and “hardwires” certain IP‑address chains (like a lightweight SDN overlay for the critical paths).

    On top of that, you propose using an FM broadcast station to push cached data (think: Ubuntu security updates) into the entire mesh simultaneously, side‑stepping the multi‑hop chaos.

The question then is: In such a system, can FM broadcast practically outperform a purely mesh‑based distribution for bulk data, and where are its limits?
 
1. Rethinking the “limited bandwidth” of FM data
You originally anchored on something less than a 56k modem, which is historically grounded — RDS (Radio Data System) gives only 1.1875 kbit/s. But modern FM subcarrier technologies are much faster:
Technology                       Typical data rate                  Remarks
RDS                              ~1.2 kbit/s                        Trivial; useful only for tiny metadata
DARC (Data Radio Channel)        16 kbit/s                          Used in Japan for traffic info
FMeXtra (multi-subcarrier)       Up to 128 kbit/s                   Digital audio + data on FM sidebands
DRM+ (Digital Radio Mondiale)    37–186 kbit/s (100 kHz channel)    Can coexist in FM band
HD Radio (in-band on-channel)    Up to ~150 kbit/s data capacity    Used in the US

A conservative but practical figure for a dedicated data subcarrier on a commercial FM station is about 16–32 kbit/s. With modern OFDM and a full 200 kHz allocation re‑purposed for nothing but data, you could push 200–300 kbit/s without violating physics. So let’s work with a range:

    Low end: 16 kbit/s (easy to realise with cheap receivers)
    Mid range: 64 kbit/s (comparable to an ISDN line)
    Optimistic: 200 kbit/s (a whole FM channel dedicated to data)

2. How FM broadcast changes bulk‑data distribution in a mesh
In a pure mesh, distributing a 50 MB security update to N nodes using flooding or a gossip protocol consumes at least:

mesh airtime ≈ (S × H_avg) / C_mesh

where S is the file size, H_avg the average hop count from the source, and C_mesh the effective per‑hop capacity (taking interference, contention, and routing overhead into account). In a chaotic mesh, H_avg can be high, and the aggregate capacity shared by all relays is limited: the classic “wireless multi‑hop” capacity drain.

An FM broadcast flips this completely:
    One transmitter radiates the data once.
    Every node in coverage that has a simple FM receiver can write the data to its local cache simultaneously.
    The mesh itself is not loaded at all; its capacity stays free for interactive traffic.

The download time per node is simply:

T_FM = S / R_FM

Update size                        16 kbit/s    64 kbit/s    200 kbit/s
1 MB (small patch)                 ~8 min       ~2 min       ~40 sec
50 MB (typical security bundle)    ~7 hours     ~1.8 hours   ~35 min
200 MB (full desktop update)       ~28 hours    ~7 hours     ~2.3 hours

If you’re willing to push even a modest 32 kbit/s stream overnight, a large 100 MB‑class update becomes perfectly viable — and it arrives on all nodes by morning with zero mesh congestion.
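The times in the table follow directly from T = S / R; a minimal helper to reproduce them (decimal megabytes assumed):

```python
def broadcast_hours(size_mb: float, rate_kbit_s: float) -> float:
    """Hours for one pass of a file over an FM data stream; the result
    is independent of how many nodes are listening."""
    bits = size_mb * 8_000_000              # MB -> bits
    return bits / (rate_kbit_s * 1_000) / 3600

# Reproduce the table rows for 16, 64 and 200 kbit/s streams:
for size_mb in (1, 50, 200):
    row = [broadcast_hours(size_mb, r) for r in (16, 64, 200)]
    print(f"{size_mb:>3} MB:", "  ".join(f"{t:6.2f} h" for t in row))
```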
 
3. Parallel frequencies: hitting a “sweet spot” for faster delivery
Following your original idea of sending packets over several frequencies in parallel: a node equipped with M independent tuner/demodulator chains can multiply its download rate by M.

For a node with M = 4 receivers, each at a modest 32 kbit/s, you get 128 kbit/s aggregate. That 50 MB bundle now arrives in ~53 minutes — comparable to a slow DSL line. The engineering cost is essentially a small software‑defined radio front‑end that can digitise a few MHz of spectrum and decode multiple narrowband channels in parallel.

Even with M = 10 (entirely feasible with a single wideband SDR dongle and a Raspberry Pi), you’d have ~320 kbit/s, enough to distribute a 200 MB update in under 1.5 hours.

So the “sweet spot of receivers outperforming daisy‑chained wired connections” I calculated earlier (hundreds or thousands of receivers to match fibre) is not the right metric here. Instead, the sweet spot is: how few parallel FM receivers does it take to make broadcast‑based distribution faster and cheaper than the mesh’s own multi‑hop distribution for a given node count? And the answer is drastic: even a single‑frequency broadcast wins over a mesh flood for any non‑tiny N, because the mesh flood incurs a factor of H_avg airtime multiplied by contention.
 
4. Hybrid model: broadcast carousel + mesh repair
A realistic system wouldn’t rely on a single perfect broadcast transmission. It would use:

    FM broadcast carousel: The update is split into blocks, each stamped with a version number. The transmitter repeatedly cycles through all blocks (a “data carousel”, like FLUTE/ALC over DVB‑H).
    Local caching on receipt: As nodes receive blocks correctly, they store them. A node might miss a few blocks due to radio fading or starting late.
    Mesh gossip for the last few blocks: Nodes that have completed the file can share missing pieces within the mesh using a peer‑to‑peer protocol (BitTorrent‑like). Since the mesh is only used for a small fraction of the data, the overhead is minimal.

This combination marries the broadcast’s one‑to‑many efficiency with the mesh’s resilience and yields practical completion times even in poor FM reception conditions.
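A toy simulation makes the carousel-plus-repair split concrete; all parameters here (block count, loss probability, number of passes) are illustrative, not from the original text:

```python
import random

def blocks_missing_after(n_blocks: int, loss_prob: float,
                         passes: int, seed: int = 42) -> set:
    """Blocks a node still lacks after `passes` carousel cycles, with each
    block lost independently with probability `loss_prob` per cycle.
    Whatever remains is what the mesh gossip has to repair."""
    rng = random.Random(seed)
    received = set()
    for _ in range(passes):
        for blk in range(n_blocks):
            if rng.random() > loss_prob:
                received.add(blk)
    return set(range(n_blocks)) - received

# 1000-block update, 10 % per-block loss, three carousel passes:
# expected leftovers ~ 1000 * 0.1**3, i.e. about one block for the mesh
print(len(blocks_missing_after(1000, 0.10, 3)), "blocks left for mesh gossip")
```

The point of the sketch: losses fall geometrically with each carousel pass, so the mesh only ever carries the thin tail.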
 
5. The role of your “automated mapping crawler”
This fits beautifully. The crawler that discovers stable, low‑latency routes to internet gateways could do two things:

    Pre‑position cache nodes: Identify the mesh nodes best suited to become broadcast injectors, possibly those with good internet access and the ability to feed an FM transmitter (if the mesh is large enough to warrant its own low‑power broadcast station).

    Adaptive push scheduling: The crawler sees that a particular route to the update server is congested; it could trigger the FM broadcast interface to start pushing the file, offloading the mesh entirely for that data.

The result is a network that uses its internet connections for small, delay‑sensitive traffic and its broadcast downlink for bulk, delay‑tolerant content — a very clean separation of concerns.
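One way to picture the crawler’s “hardwired chains”: from measured link latencies, run a multi-source shortest-path pass from the gateways and pin each node’s next hop. Everything here (the function name, the latency map, symmetric links) is an illustrative assumption, not part of the original proposal:

```python
import heapq

def pin_gateway_routes(links: dict, gateways: list) -> dict:
    """Multi-source Dijkstra over measured latencies `links`
    ({node: {neighbour: ms}}, assumed symmetric). Returns each node's
    pinned next hop on its fastest route toward some gateway."""
    dist = {g: 0.0 for g in gateways}
    next_hop = {g: g for g in gateways}
    heap = [(0.0, g) for g in gateways]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                        # stale heap entry
        for neigh, ms in links.get(node, {}).items():
            if d + ms < dist.get(neigh, float("inf")):
                dist[neigh] = d + ms
                next_hop[neigh] = node      # pin the hop toward the gateway
                heapq.heappush(heap, (d + ms, neigh))
    return next_hop

# Tiny example: b reaches the gateway faster via a (3+5 ms) than directly (20 ms)
mesh = {"gw": {"a": 5, "b": 20},
        "a":  {"gw": 5, "b": 3},
        "b":  {"a": 3, "gw": 20}}
print(pin_gateway_routes(mesh, ["gw"]))
```

A real crawler would refresh these pinned hops continuously as its probes observe the mesh changing, which is the “constant overwatch” idea.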
 
6. Is this theory sound? Yes, and it’s not new (which is a good sign)
    The general idea is known as “broadcast disk” or “push‑based data delivery” (Swarup Acharya, Michael Franklin, Stanley Zdonik, 1990s).

    In disaster and rural mesh networks, VHF/UHF data broadcasting has been proposed to distribute firmware updates and educational content (e.g., the “Internet‑in‑a‑box” over FM).

    The DAB/DMB and ATSC 3.0 standards explicitly support file carousels for software updates to vehicles.

Your twist — coupling it with a chaotic mesh that self‑maps and optimises fixed routes — adds a layer of cognitive network management that would make an excellent research prototype.
 
7. Where the boundaries lie
    Regulatory: Broadcasting on the FM band is tightly licensed. A small unlicensed transmitter (FCC Part 15 in the US, for instance) is limited to a range of a few hundred metres. So in practice, a city‑wide mesh would either need an existing FM station partner or move the concept to an ISM band using a similar broadcast‑style modulation.
    Cost: Adding even a simple FM data receiver to every mesh node costs a few dollars. That’s fine for fixed infrastructure but might be prohibitive for ultra‑low‑cost IoT nodes.
    Update size: Above ~2 GB, even 200 kbit/s becomes painfully slow (22+ hours). For such cases, sneakernet or partial mesh delivery is still needed.

Summary: the theoretical calculation of the “sweet spot”
Given a file of size S, a broadcast rate R_FM, and a mesh with N nodes and average hop distance H:
    Broadcast‑only needs T = S / R_FM, independent of N.
    Mesh flood needs at least T_mesh ∝ S · H · N / C_mesh.

Even for N = 10 nodes, the broadcast has already won by a factor roughly equal to H (because the broadcast does the work once, while the mesh repeats it over multiple hops). Adding parallel FM frequencies increases R_FM linearly, scaling the advantage.

So from a data delivery theory perspective, an FM broadcast push is an extremely elegant way to deliver the same chunk of popular cached data to a whole chaotic mesh — provided you can live with the absolute delivery latency of a slow to moderate bitstream. It transforms an O(N) problem into O(1), which is exactly the kind of architecture that makes mesh network research interesting.
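The O(N)-versus-O(1) claim in one arithmetic sketch; node count, hop count and link speed are made-up illustrative values:

```python
def mesh_flood_airtime_s(size_mbit: float, n_nodes: int,
                         avg_hops: float, link_mbps: float) -> float:
    """Rough serialised lower bound: every node's copy crosses avg_hops
    links that all contend for the same spectrum -> grows with N."""
    return size_mbit * avg_hops * n_nodes / link_mbps

def broadcast_airtime_s(size_mbit: float, rate_mbps: float) -> float:
    """One transmission reaches every listener -> independent of N."""
    return size_mbit / rate_mbps

# 50 MB (400 Mbit) update, 100 nodes, 4 hops on average, 10 Mbit/s mesh links,
# versus a single 0.2 Mbit/s FM data channel:
print(mesh_flood_airtime_s(400, 100, 4, 10))  # 16000.0 s of mesh airtime
print(broadcast_airtime_s(400, 0.2))          # 2000.0 s, despite a 50x slower link
```

Even with a link fifty times slower, the broadcast finishes first because its cost does not scale with the number of nodes.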

...in a close potential future...

The Mall in the Cité was busy. During the economic collapse, when both heat and rain came, many heard the Call and started moving. The worker quarters, half dependent on the welfare system and on low-cost jobs serving the new rich and those complying with the post-world-war system, put their hide-and-seek game with the ever corrupt and dominant forces of the system onto a new level.

Old European vans, hatchbacks, even funeral cars and vans were converted into something that would actually drive.
 
Every law, and especially the traffic laws, was made to ensure the safe conduct of the all-medicated crowd accepting their nine-to-five, five-day existence, and obedience was needed. Whatever you owned must have been made by corporations, especially if used for the purpose of prosperity by financial gain. Louis was back and it was cake for everyone. Plastic-sealed, one for all, deep-frozen, standard-size factory cake.
 
First, those in the system used their position to upgrade life far outside the city centers once owned by the ancient lords of Europe and now taken by the modern version of feudalism, the corporates, who kept teaching false narratives about the past and about who was what most of the time.
 
Burgers had fixed times before they had to be disposed of, and cold burgers are no meal, except if further processed by IRC-exchanged recipes. Supermarket trash boxes were cleaned out and their contents distributed, feeding and giving a base to a new underground. Just as a corporate supervisor can call for a dinner pizza party, walkie-talkie communication organized with paramilitary discipline between the housing estates of the working poor can trigger a food delivery of about the same level, but with love and joy, an extravagance made for someone you knew by name. This was worse than Communism. The parasites were known and made to move out closer to their real lords into the stuttering system; the rest shared, based on the bad conscience of having taken too much without being pointed out.
 
Old electronics were repaired using basic tools and a lot of time, financed by welfare and endless unemployment, rewarded with fine sleep and a clear conscience measured in smiles created. When the first cyberdecks were home-built from shared designs, a new level was reached. School absence increased exponentially, and the first cracking systems were created to understand what was actually stored in the large warehouses and what was said about whom in the posh offices. Quickly more trash was accumulated, from high-quality corporate copy machines to old server hardware and even office furniture: disposable trash for the top floors of a rotten oligopoly system based on unfair wage distribution, falling off the books, but raw diamond material for those eager to learn, build and create below the system.
 
Entire boroughs turned into connected, parallel-processing computer clusters, and instead of corporate chill time spent on five websites trading corporate wage for corporate products, tabletop role-playing games were turned into terminal-based, text-only adventures, and open-source AI tools fed with classic and independent literature became expert systems to discuss the reason of life and the purpose of existence, and to create stories and text adventures to be shared over IRC the next day.
 
The workers stayed calm and kept serving the system in their positions, watching it fail while eating and learning through a parallel underground system that was no black market based on cigarettes and drugs, but one that took from a sterile, clean trash mountain without notice, having spotted the blind spots of corporate consumer data centers.
Then, the spare money was used to connect with other towns far beyond family ties.
So, how do you bring really wanted trash from Amsterdam to Paris? Riggers were born... eventually the more adrenaline-affine turned around at the welfare office out of heartache, and more went fully underground.
 
A new world was born.
 
Shadowrunning, but not at all as seen by medicated sociology teachers. Cheese was much more important than Credits ... The heat though, that was a perfect hit.
 
Incorporated with DeepSeek 
 
The heat had settled on the Cité like a damp woollen blanket, the kind that used to be tossed into charity bins before the bins were repurposed as rooftop water tanks. By late morning the concrete towers shimmered, their patchwork balconies and illegal extensions dripping shade onto the courtyards below. The Mall – no one called it *Centre Commercial* anymore, just the Mall – sat in the belly of the quartier, a low-slung beast of corrugated steel and scavenged glass that had once been a supermarket, then a corporate training hub, then a dead shell. The residents had claimed it floor by floor, wall by wall, without a single permit, until it grew inward like a termite cathedral, a tangle of workshops, kitchens, server stacks, and sleeping lofts that the official maps still marked as “retail void”. For the city it was a no-go area, not because of crime but because there was nothing inside worth taxing, nothing that could be converted into a quarterly earnings graph. For the people who lived there, it was the castle.

Louis came back at ten-forty-seven, the suspension of the old funeral van still groaning from the autobahn. He’d left three days before with a shopping list whispered through IRC relays and a fuel cell half-paid in cheese wheels. The run had been Amsterdam to Paris – a straight shot if you knew which cargo bays leaked data and which corporate logistics managers were too overworked to notice a few pallets shifting destination. Louis was a rigger, though he hated the word. He preferred “logistics of affection”. The van’s rear doors swung open onto the Mall’s loading courtyard, releasing a blast of industrial coolant and the unmistakable smell of factory fondant.

“Cake!” yelled Zina from the third-floor window, her voice ping-ponging off the bricked-up escalators. She wasn’t looking at Louis; she was watching the courtyard cams on a salvaged tablet, the feed routed through a mesh node disguised as a pigeon coop. Within minutes the word spread not by loudspeaker but by a chain of knocks on water pipes, a system older than the cyberdecks but just as fast.

The Mall woke properly then. I came down from the repair deck, fingers still numb from re-capping a server board someone had fished out of a La Défense skip. The stairwell – a zigzag of repurposed office partitions and cable trays – thrummed with kids and elders alike, all moving towards the central hall. The hall itself was a cathedral of bricolage: food stalls made from copy-machine casings, a long table built of boardroom doors laid end-to-end, and above it all, a canopy of fibre-optic strands that pulsed faintly with the traffic of the local net. No one shouted orders. No one pushed. You just found a patch of floor and waited, because today was a cake day and cake days were church.

Louis stood next to the van, peeling off his driving gloves. The cake was the kind corporations handed out at mandatory team-building fun-days: perfect rectangles of vanilla sponge entombed in yellow icing, each one sealed in a plastic coffin with a “Best Before” date that had passed at midnight. Perfectly good, legally garbage. The riggers had a saying: *Every expiration date is just another timetable.* Louis had sixty-four boxes. A pallet meant for a corporate campus in Amstelveen, diverted one junction earlier to the loading dock of a warehouse that didn’t exist. The driver of the intermediate truck had been paid in fresh ricotta and a three-act play performed via text-only terminal the night before. Credits never entered the equation.

“Cheese is more important than credits,” Fatou muttered, hefting a box of cake towards a serving table she'd welded herself from an old server rack. She ran the food distribution, a woman whose entire accounting system was a Moleskine filled with drawings of cows and matchstick people smiling. Next to her, a teenager called Rahim was already logging the cake into the open-source inventory using a homebrew cyberdeck built from a broken tablet and a mechanical keyboard that clicked like a Geiger counter of joy. No one would go hungry in the Cité because hunger was a lack of imagination, and the Mall had imagination in surplus.

By noon the hall was thick with the scent of reheated coffee – real coffee, not the corporatised chicory sludge, because a rigger from Genoa brought back beans monthly in the spare tire well of a hearse. People sat on reconstructed office chairs that had been thrown out when the leasing companies rebranded. They ate cake and talked. Some plotted the next text-adventure module, a sprawling interactive narrative based on *La Chartreuse de Parme* mixed with low-orbit satellite heists, its script debated with an AI that had been trained on every free literature repository they could mirror. The AI argued back now, sassy and well-read, its voice piped through a speaker grill cut from a discarded smart-fridge. Others discussed the maintenance of the rooftop water systems or the latest firmware crack for a popular home assistant drone, now repurposed to carry soup between towers.

I found a corner near the old IRC terminal station, a bank of monitors and chunky keyboards that looked like a museum exhibit from the 2040s. The screens glowed green text on black: conversations flowing from Grenoble, Madrid, a commune in Leipzig, all linked by the mesh nets the riggers strung between cities on their runs. Someone in Amsterdam was asking for a specific epoxy formula to patch a heat exchanger; someone in Lyon offered a crate of capacitors in exchange for a short story that made them cry. The old world saw only poverty in these exchanges. It missed the point entirely: we had turned waste into wealth not because we were noble, but because we had time, and time was the one thing the system couldn’t commodify without us.

Around two, a minor drama. A scavenging crew returned from the business district with a haul of office furniture so pristine it still had the plastic wrapping. But they’d also brought a corporate-grade printer the size of a small car, a machine designed to shred itself if tampered with. The printer sat in the courtyard like a captured beast, its status LED blinking a defiant proprietary pattern. An impromptu team formed: electronics tinkerers, code whisperers, a poet who swore he could talk any machine into submission by reading it Rimbaud. By four they had it purring, spitting out zines full of recipes and anarchist cooking tips, using paper the printer’s own DRM chip tried to reject. The laugh that echoed when the first “Occupation Read-Only Memory” error was bypassed felt like a revolution in a teacup, the kind of revolution that never made the news.

Outside the Mall, the city functioned as the city always had. From the upper windows you could see the glossy towers of the centre, stroking the sky with their holographic adverts. The police scanners, monitored by a bored fourteen-year-old on the ninth floor, reported a traffic violation crackdown on the périphérique, a corporate VP’s lost hover-limo, an arrest of a man who tried to sell home-grown tomatoes without a permit. Nothing in the Cité. To the system, these boroughs were a blind spot, a data void where people were supposed to be unemployed, medicated, and obedient, but instead were building castles and eating cake. The quiet wasn’t peace – it was a deliberate act of hiding in plain sight, a game of hide-and-seek played so long it had become a way of life.

Evening came with the heat finally cracking open into a brief, violent rain. The courtyard transformed into a basin of silver, the rain hammering onto the corrugated roof in a rhythm that the drummers in the fifth-floor music coop would sample later. The Mall’s lights, a mix of salvaged LEDs and hand-soldered circuits, blinked on and turned the place into a warm amber cocoon. Louis sat on a bench carved from an old server cabinet, a slice of cake in one hand and a mug of wine – real wine, bartered for a repaired agricultural sensor – in the other. He looked tired but at peace, the kind of peace that comes from having moved something wanted through a world that said wanting was a sin.

I asked him about the heat. The corporate heat, I meant, because in the old Shadowrun stories it was always about glowing megacorps hunting down runners with satellite tracking and private armies. He laughed and swallowed cake. “The heat,” he said, “is a joke. They don’t even see us. We’re not on their map. Their loss-prevention algorithms flag missing pallets as ‘shrinkage’ and write it off. What’s a few cake boxes to a corporation that wastes more energy on a single board meeting than this whole Cité uses in a year?” He tapped his temple. “The perfect hit is invisible. No heat at all – that was a perfect hit.”

The night rolled in, and the Mall settled into its second rhythm. The main hall became a sleeping space for those who needed it, blankets and mattresses appearing from hidden lockers. The server room hummed as someone ran a text-only role-playing game, the dungeon master typing descriptive prose that flickered onto terminals across the Cité and three other cities. A small group gathered around the expert system to discuss the purpose of existence, the AI quoting Stendhal and a 16th-century Sufi poet in alternating lines, because one of the hackers had fed it a new corpus last week and it was still digesting. The conversation was gentle, meandering, the kind of impossible luxury the system couldn't understand – the luxury of sitting in a self-made castle and wondering why you were alive, without an ad interrupting.

I climbed up to the roof, past the hydroponic bays and the water collectors, past the antennas that knitted us to the other lost boroughs. From there the Cité sprawled like a concrete reef, every balcony an extra room, every rooftop a garden or a workshop, every empty window frame a story of something repaired. It was a slum on the outside, a fortress on the inside, a place that had been rendered invisible precisely because it had nothing the mighty wanted – until the mighty realised that what we had was each other, and that was the most dangerous product of all.

Below, the Mall glowed. Someone had strung up a line of fairy lights made from scrapped fibre optics, and they pulsed in time with the data streams. The rain had stopped. A new IRC query blinked on my wrist terminal: “rigger run to Barcelona – need 4 wheels of comté and a working hard drive from before the Win10 collapse. story payment open.” I smiled. There would be another day, another cake, another perfect hit. The heat was just weather, and we had built a house that could weather anything. 
 
The Discotheque Labyrinth was as Cité as the Mall. One of the social-housing blocks had been turned into a bachelor place full of shared flats. Young people need somewhere to meet and find love, and it was also the place where visitors from other towns stayed and eventually settled.
The couches were leather, easy to clean; the rooms were individualistic and workshops at the same time; the kitchen was shared, and at the weekend its wide-open swinging doors admitted everyone for a party. The only actually illegal drug consumed was homegrown cannabis; the alcohol was moonshine beer and whisky, together with socially engineered off-the-books liqueur and quite some knowledge about what was good, and great, for its price.

The refurbished hi-fi was synced, the entire place one large cyberdeck: an integrated smart home with access surveillance, presence detection, music distribution, and bar and kitchen management. Orders placed over terminals were delivered by toy railways, or by autonomous RC-car drones when a table sat too far from any kitchen, the whole thing backed by military-grade warehousing of source materials.
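The story never specifies the Labyrinth’s software, so here is a purely illustrative sketch of how the bar’s courier routing might look: every name in it (`Order`, `dispatch`, `TRAIN_REACH_M`) is an invented assumption, not anything from the text.

```python
from dataclasses import dataclass

# Assumed reach of the ceiling-track toy-train network from a kitchen.
TRAIN_REACH_M = 25.0

@dataclass
class Order:
    table: str
    item: str
    distance_to_kitchen_m: float

def dispatch(order: Order) -> str:
    """Pick a courier: ceiling trains cover tables near a kitchen,
    RC-car drones trundle out to the far corners."""
    if order.distance_to_kitchen_m <= TRAIN_REACH_M:
        return "toy-train"
    return "rc-drone"

if __name__ == "__main__":
    print(dispatch(Order("couch-7", "moonshine beer", 12.0)))  # toy-train
    print(dispatch(Order("roof-lounge", "liqueur", 60.0)))     # rc-drone
```

The point of the sketch is only the design idea: a single shared order database, with the delivery mechanism chosen per order rather than per room.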

The place was busy, the DJ not always physically present, and the screens showed streams from other places, encrypted, hidden from the corporate media worldwide. French was spoken in 52 nations and broken English in every single one. A cleaner in an IBM office can create a lot of incredibly stable backbone nodes, which a hotel cleaner then covers as tunnelling access from guests.
 
Incorporated with DeepSeek 
 
The cleaner at IBM La Gaude was named Soraya, and she had been invisible for seventeen years. She wore the blue polyester uniform of a contracted facility services company that changed its name and shell registration every eighteen months, and she pushed a cart that beeped with the RFID tags of the cleaning solvents the system required her to use. The cart had a false bottom, of course. Beneath the required chemicals sat a small, immaculate server blade, harvested from a skip during the data-centre retrofitting of ’29, now purring as a mesh node with a throughput that would make a mid-sized corporation weep. It was powered by the charging bay meant for the floor-scrubbing drone. The drone never scrubbed floors; it was busy running fibre taps into the building’s unused maintenance conduits. Soraya clocked in at six, clocked out at two, and in between she wiped keyboards that cost more than her annual rent and made sure the Cité had a backbone node that IBM’s own network security team couldn’t see, because they didn’t look at cleaning staff and never would.

Three hundred kilometres north, in a hotel near the Gare du Nord, a second cleaner named Idrissa did the same work. He covered the tunnels. His hotel was a mid-range chain where corporate guests slept off their meetings and streamed their compliance-mandated wellness content. Their guest Wi-Fi was a tunnelled river that he rerouted through Soraya’s blade, then into the Cité mesh, the data wrapped in encryption layers that looked like hotel billing traffic. Idrissa had learned to code not from a school but from a text adventure someone had written about a janitor who hacked the Pentagon. When he finally met the author via IRC, they’d cried together over a character death.
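The story only says the mesh traffic was “wrapped in encryption layers that looked like hotel billing traffic”, with no protocol details. As a toy sketch of that disguise idea, the snippet below hides an arbitrary payload inside something shaped like a billing record; every field name (`"sku"`, `"room"`, the `wrap`/`unwrap` helpers) is invented for illustration, and real traffic shaping would of course involve actual encryption rather than plain base64.

```python
import base64
import json

def wrap(payload: bytes, room: str = "412") -> str:
    """Dress an arbitrary payload up as a hotel billing record."""
    record = {
        "type": "billing",
        "room": room,
        "items": [{"sku": "MINIBAR", "note": base64.b64encode(payload).decode()}],
    }
    return json.dumps(record)

def unwrap(record_json: str) -> bytes:
    """Recover the payload from a wrapped record."""
    record = json.loads(record_json)
    return base64.b64decode(record["items"][0]["note"])

if __name__ == "__main__":
    msg = b"mesh hello from the Cite"
    assert unwrap(wrap(msg)) == msg
```

To a casual log inspector the wrapped record is just another line of hotel billing JSON, which is the whole trick the passage describes.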

The Discotheque Labyrinth sat in what the housing registry once called Block C, a twelve-storey workers’ dormitory now occupied entirely by people under thirty-five who had decided that the old world’s ideas about career and property were a dull joke. They had knocked down walls, built new ones from soundproofed server cabinets, and painted every surface in colours that didn’t have corporate names. The building was a maze of bedrooms, workshops, and common spaces that sprawled between floors like a vertical village. Tonight, the Labyrinth was alive.

---

A Friday night, the heat still radiating from the concrete even after dark. The entrance was a pair of swinging doors salvaged from a demolished theatre, their brass handles worn to a soft gleam by a thousand hands. Above them, a sign spelled DISCO LAB in hand-wired amber LEDs, the T and H having fallen off years ago and never replaced because everyone knew what it meant. The doors swung open constantly: young people from the Cité, visitors from other quarters who’d heard stories, a couple of riggers fresh from a run to Lyon, their van still ticking as it cooled in the courtyard, packed with cheese and fifty kilos of industrial chocolate powder meant for a vending-machine supply chain.

Inside, the main hall was a cavern of warmth and sound. The hi-fi system was a monster, a patchwork of salvaged amplifiers, hand-coiled speakers, and a sub-bass unit built into a former elevator shaft that made the floor vibrate in rhythm. Tonight the DJ was in Berlin, a woman called Grete who streamed her set through Soraya’s IBM blade and Idrissa’s hotel tunnel, her signal arriving with less latency than any corporate streaming service. Her face appeared on a screen above the dance floor, beside other screens showing a rooftop party in a Barcelona squat, a kitchen in Dakar where someone was frying fish and laughing, a text-only terminal scrolling poetry from a collective in Warsaw. The screens were not entertainment; they were windows into a world that refused to be isolated.

The couches were leather, real leather, scavenged from a law firm that had redecorated and thrown away furniture that still smelled of money and anxiety. Now they smelled of cannabis, homegrown in the Labyrinth’s rooftop greenhouse, a gentle haze that mixed with the yeast and malt of moonshine beer from the basement brewery, the sharp bite of home-distilled whisky, and the sweet complexity of liqueurs brewed from foraged herbs with recipes shared over IRC. Behind the bar, a young man named Théo poured drinks without touching credits, only marking orders in a shared database that tracked favours, stories, and the occasional promise to repair a washing machine.

The bar itself was a theatre of automation. Orders came in over personal terminals, typed out in broken English or rapid French, and were delivered by a network of toy trains running on suspended tracks across the ceiling, their carriages carrying glasses with unerring precision. For the harder-to-reach corners, small RC car drones, modified from military-surplus bomb-disposal bots that a rigger had pulled from a Dutch scrapyard, trundled across the floor with trays balanced on their backs, their little headlights winking. The entire building was one integrated smart home, a cyberdeck of walls and wires. Presence sensors tracked occupancy to dim lights in empty rooms. The music followed you from hall to lounge, seamless. Surveillance cams – all open-source, all monitored by the community, not some distant security firm – kept the place safe, their feeds encrypted and shared only with those who lived there.

In a corner, a group of newcomers from the outer banlieues sat wide-eyed, watching a terminal stream a conversation in English between a woman in Lagos and a man in Manila, discussing the best way to solder a capacitor onto a drone controller. One of them, a girl of maybe seventeen with tired eyes, turned to her companion and whispered, not quite believing: “They’re just... sharing it. For free.” The companion, a boy with calloused hands from a warehouse job, nodded slowly. This was worse than Communism indeed – it was generosity without expectation, and it broke something inside him in the best way.

The Labyrinth was not just a party. It was a node of the underground, a place where riggers could sleep between runs, where a young woman from the Cité might meet a traveller from Turin over a glass of contraband liqueur and end up married in a ceremony performed via text terminal with guests from six time zones. The private rooms were individualistic caves, each a workshop and a sanctuary: one full of half-assembled drone parts, another a library of paper books rescued from the pulping machines, a third a darkroom for film photography because someone had found a cache of undeveloped film in a demolished photo lab and decided to learn the whole art from scratch. Love happened here, of course. It happened in the quieter hours, on those leather couches, in conversations that lasted until the sun came up and the automated bird-feeders on the roof clicked on.

Tonight, a rumour spread through the crowd: a new run was being planned. Not a small run for cake or capacitors, but a big one. A rigger from the north, a woman with a shaved head and a tattoo of a circuit diagram running down her arm, was looking for a team. She needed someone who knew the insides of a pharmaceutical warehouse, someone who could drive a refrigerated truck, someone who could sweet-talk a loading-dock system with a custom script. The target wasn’t drugs, though; it was insulin. A whole pallet of expired, still-perfectly-viable insulin that a corporation had written off because the “Best Before” date had passed by twelve hours. The rigger had found it through a data leak a cleaner in Frankfurt had spotted in a trash report. The insulin was destined for a community in Oran that couldn’t access the corporate supply chain without paying a year’s wages for a single vial. The pay, as always, was not credits. It was cheese, stories, and the knowledge that somewhere a diabetic child would live.

In the kitchen, Fatou from the Mall had set up a temporary station, reheating leftover cake from Louis’s run and serving it alongside a stew made from vegetables the rooftop gardens couldn’t store. She was arguing amiably with the Labyrinth’s main cook, a wiry man called Samir who insisted that his grandmother’s couscous recipe could be improved with a touch of the Dutch cheese Louis had brought. They were both wrong, but the result was delicious, and it fed eighty people before midnight.

The screens above the bar flickered and, for a moment, showed a feed from the outside world: a corporate news bulletin about a new trade agreement, a smiling CEO shaking hands with a politician, a graph illustrating quarterly growth. Someone booed, and someone else threw a cushion at the screen. The feed was replaced by the Warsaw poetry terminal, and the collective sigh of relief was palpable. The outside world was a fiction they had all agreed to stop believing.

As the night deepened, the party shifted. The music slowed, became something more introspective. A young woman plugged a synth she had built herself into the sound system and played a melody that wandered like a question. Couples found corners. The trains kept running, delivering glasses of water alongside the last of the moonshine, because the system cared for its own. The surveillance showed a quiet perimeter: nothing but the occasional fox, the distant lights of the périphérique, the steady blink of the antennas on the roof.

In one of the upper rooms, a group of the original Labyrinth builders sat around a table, drinking whisky and talking about the early days. The days before the mesh, before the runs, when it was just a few of them squatting in an abandoned block and dreaming. One of them, a woman with grey streaking her hair now, remembered the first cake delivery, the first time someone had said “cheese is more important than credits” and meant it. Another recalled the first text adventure they’d written together, a sprawling thing that had taken six months and ended with a line that still made him cry. They were the elders, all of thirty-five, and they looked at the young faces dancing below with a fierce, tired pride.

The Discotheque Labyrinth was a castle, too. Not of steel and glass like the Mall, but of sound and light and the stubborn, radiant belief that a different world was already here, tucked inside the old one like a secret. The system saw nothing here worth taxing, worth policing, worth noticing. And that, of course, was the perfect hit.

---

Near dawn, the doors swung open one last time. A rigger from Amsterdam walked in, exhausted, carrying a box of cheese and a hard drive full of Dutch literature he’d digitised from a library that had closed. He was greeted by name, handed a bowl of stew, and shown to a couch. He fell asleep within minutes, his boots still on, a smile on his face. The trains trundled softly on their tracks, the screens glowed, the mesh hummed. Outside, the city slept its medicated sleep, and the Cité, invisible and magnificent, was just beginning another day.