I really believe that for commercial cellular in urban areas, 5G is more of a benefit for the carriers than the consumers. While low band spectrum (850MHz) has decent penetrating power, the majority of 5G is deployed at ~5GHz and mm-Wave frequencies, where you won't get much of any signal through a wall, or even on the other side of the building. This is why so many 5G base stations have to be deployed; the coverage just isn't as good. But the improvements in base station handoffs and spectral efficiency will increase a carrier's total capacity, even if the end user doesn't see a massive benefit.
There are scenarios where you might see massive speeds, though. 5G heavily leverages carrier aggregation, where a client can connect to multiple channels at once, increasing bandwidth, and beamforming, where the "shape" of the signal is modified to steer maximum power toward the physical location of a client. If you happen to be near a base station with a good signal-to-noise ratio (SNR) while downloading something large, the base station can allocate multiple channels and use beamforming to finish your download at Gbps speeds and get you off the network faster, freeing up spectrum time for more users at lower SNRs and bandwidths.
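A toy sketch of why high-SNR users are so cheap to serve in spectrum-time, using the Shannon capacity limit (the function name, channel width, and SNR values here are my own illustrative picks, not real carrier allocations):

```python
import math

def shannon_mbps(bandwidth_mhz: float, snr_db: float) -> float:
    """Shannon capacity ceiling C = B * log2(1 + SNR), in Mbps."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)

# One 100 MHz channel at various SNRs: every 10 dB of SNR buys
# roughly 3.3 more bits per Hz, so the high-SNR user drains the
# same download in a fraction of the airtime.
for snr_db in (0, 10, 20, 30):
    print(f"{snr_db:2d} dB SNR -> {shannon_mbps(100, snr_db):6.0f} Mbps ceiling")
```

Real modulation and coding schemes sit below this ceiling, but the shape of the curve is the point: serving the near, high-SNR user first frees the channel sooner for everyone else.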
There is potential for much more impactful deployment of 5G cellular in rural areas where there aren't many buildings or walls blocking high frequency signal. The same goes for point-to-point links such as inter-building WAN or low latency self driving car to self driving car communication.
>> use beamforming to finish your download at Gbps speeds and get you off the network faster
Ya, "Faster speeds mean people will spend less time clogging the connection" ... it never happens that way. Increased speeds mean people use more internet, not less. Consumers start doing new things with the faster speeds (streaming video etc) increasing demand and overall usage. The only real answer is massive and constantly increasing speeds and bandwidth capacities. It is a race that cannot be won but must be run nevertheless.
How much cellular data people use is dictated far more by data caps and usurious overage charges than by the speed of their connection. With overage charges of >$100 a GB in some parts of the world, usage per person isn't going to increase just because speeds go up.
Unless the carriers have a sudden outbreak of generosity, data caps won't rise under 5G, so usage per person will stay the same but improved efficiency will allow the carriers to extract more profits from more people per cell simultaneously.
5G doesn't solve any problems for end-users. It solves the carriers' problem of 'how can we get more people to use up their data cap faster.'
Data caps seem to have risen pretty fast recently under 4G, and the prices of unlimited data plans have come down. You can get unlimited 5G data plans, on a good network, for under £30 per month here in the UK. Even cheaper on a crappy network!
Of course, how fast and aggressively the prices come down depends on the competitive environment in your country. But it will happen.
In some parts of the world it's very normal to have unlimited data without slowdowns. Some months I have downloaded around 200-400GB on a normal 40$/month cellular plan with no extra charges.
> Unless the carriers have a sudden outbreak of generosity, data caps won't rise under 5G,
Verizon's $80/month plan (for individuals) right now has unlimited 5G, plus 50GB of 4G LTE before you hit the soft cap. When 4G LTE was first introduced, the price was $80 for 10GB of 4G LTE. That's a factor of 5 improvement in less than a decade. Likewise, I can get 150 mbps+ on XLTE out here in the 'burbs (where Verizon's network really shines). Back in 2011, 5-10 mbps would have been great speeds.
What else in the computer industry has improved by that much over the last decade? Maybe multi-core performance of Apple's phone CPUs? (7x jump from 2013 to today.) Certainly nothing in the desktop or laptop realm. A 2019 iMac with a 9th generation Core CPU is only about twice as fast (multicore) as a 2011 iMac with a 3rd generation Core CPU.
EDIT: The $45/month is lines on a family plan. Corrected to compare $80/month for an individual plan.
> What else in the computer industry has improved by a factor of 10 over the last decade?
SSDs were still SATA in 2010 and capped around 550 MB/s (and in practice were slower than that). High-end SSDs today are maxing out PCIe 4.0 x4 at 6000+ MB/s.
GPUs probably have a 10x increase in speed. AMD 5870 and GTX 200 series... yeah... that's way old.
Hard drive capacity is probably 10x bigger, maybe.
CPUs and RAM haven't improved much. Most things seem to be getting faster though.
The same sense as in 'everyone should have a pony, alas we live in the real world'? Or the sense of 'no one shall offer anyone an internet connection unless they can satisfy these arbitrary requirements I just made up'?
I can sign up for the former. The latter would just lead to fewer people getting into the business.
Neither. As in, transfer costs are vastly overstated to laypeople in the interest of profits, and that bites consumers in the form of bandwidth caps among other things.
Note that I never said payment models need to stick to a fixed monthly price. Metered connections (at a small multiple above cost) are an option.
What if the power company cut you off or greatly reduced your electricity after your consumption hit top N% of users? Why should the internet be much different?
Maybe you live in a different part of the state, but here they are arbitrarily choosing sectors to shut off during peak hours. For all they tell us, they could be chosen at random.
> What if the power company cut you off or greatly reduced your electricity after your consumption hit top N% of users? Why should the internet be much different?
If the contract I entered with the power company allows them to do that (and I presumably got a cheaper deal on my electricity in return), that's all fair.
That's not what I meant. With higher SNR you can achieve much better spectral efficiency, so in terms of spectrum-time it makes more sense to allocate the most bandwidth to high-SNR users as they need it. It's not a constant or necessarily noticeable thing. My example was an extreme one for illustrative purposes. It could just as well be downloading 1 MB of web content at 300 Mbps.
I do think there are some reasonable limits to look forward to WRT average bandwidth consumption by a typical user.
For instance, there is a maximum display resolution beyond which it does not make any sense to continue adding pixels, color depth, or redraws per second. Most would successfully argue that we hit that point years ago. For such displays, there is a constant maximum amount of video information that could be presented per unit time. This is somewhere in the realm of 10-1000 Mbps.
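As a sanity check on that range, a quick back-of-envelope calculation (the resolution, frame rate, and compression ratios are my own illustrative assumptions):

```python
# Raw bit rate for a 4K HDR display, then a couple of typical
# compression ratios to estimate the delivered stream size.
width, height = 3840, 2160       # 4K UHD
fps = 60
bits_per_pixel = 30              # 10 bits per channel, 3 channels

raw_bps = width * height * fps * bits_per_pixel
print(f"raw: {raw_bps / 1e9:.1f} Gbps")   # ~14.9 Gbps uncompressed

for ratio in (15, 150):          # rough span of modern codec ratios
    print(f"{ratio}:1 compression -> {raw_bps / ratio / 1e6:.0f} Mbps")
```

Depending on how hard the codec works, that lands between roughly 100 Mbps and 1 Gbps, which is consistent with the ceiling claimed above.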
Audio is another good example, where we've decided that ~1.5 Mbps for uncompressed stereo audio is about as much as you'd ever really need.
Beyond this, there isn't a whole lot else that consumes substantial bandwidth per device. Assuming all gaming would be streaming over such incredible networks and A/V technologies, there is no need to worry about archaic things such as application binary, texture or other asset sizes.
And don't forget that these assets are being downloaded from services that have to pay to store them and then pay for upload bandwidth to send them to you.
If you browse to an imgur hosted image on a mobile device, you'll find that the quality is visibly bad. And it's not because the client didn't have enough bandwidth.
Streaming video is the most intense use you'd see these days. Once you hit several Mbps, more bandwidth doesn't improve it, so yes, you would get each user off the radio faster.
You'll get more users if the network is less congested, but that's a good thing.
> Increased speeds mean people use more internet, not less.
Do you have reports that show this? I've read a recent ISP report showing that as people moved to gigabit fiber, their usage didn't go up as had been feared. Beyond a certain number, something like 100 Mbps, usage no longer went up as bandwidth increased. Additionally, infrastructure costs went down significantly for the ISP, showing that moving to fiber everywhere makes good financial sense for them.
This isn't surprising at all. We see this in computing as well. When designing clusters, other network specs such as latency and jitter become the bottleneck above certain speeds, depending on the application.
> Increased speeds mean people use more internet, not less.
Do keep in mind that mobile devices have bottlenecks other than Internet speed. Eg limited battery. Or just plain old limited monthly data volume.
> The only real answer is massive and constantly increasing speeds and bandwidth capacities. It is a race that cannot be won but must be run nevertheless.
The victory is in giving people more Internet. They like that.
Quite a few. Though to be honest, 4k is currently good enough for me, resolution wise.
But: even the highest bitrates you get on eg YouTube ain't good enough to show certain scenes nicely. Especially when snow is involved. That's because the video compression algorithms we have don't cope well with that.
> "mobile networks have relatively low usage limits."
That depends on the country/network/plan. I have unlimited 5G data and get consistent speeds in the 200-300 Mbps range, sometimes higher. Some months I will use 1 TB of data or more. Streaming games on Stadia uses a lot of bandwidth!
Yeah, it's not like people's traffic profiles didn't completely change during that period. (Going from bursty web surfing and downloads to everyone pulling down 25 Mbps 4K streams continuously at the same time.)
5G is a racket designed to replace last mile infrastructure and get the telcos away from State Utility Commission regulation in favor of more pliable federal regulatory regimes.
Wireless is exempt from the restrictions that wired services are forced to accept. Wireless carriers can terminate apps that they consider disruptive and have fewer neutrality and limit restrictions.
You’re not going to have meaningful competition, and the national security nonsense attributed to 5G allows for things like siting infrastructure dramatically cheaper than incumbents.
The regulation is much easier for carriers to control at the federal level, especially with the courts in the state they are in.
Regulation is a key part of the cost of goods sold. Regulation makes markets.
For example, 5G antennas have been declared critical defense infrastructure, cable TV has not. That means that your cable company needs to follow local ordinances, get permits, rent existing pole space, etc. The 5G people can drop an antenna wherever they want with 30 days notice — on my block, a 5G millimeter wave pole was dropped three feet away from an existing telephone pole, in the middle of a sidewalk.
Wireless also operates under rules that allow it to do more revenue generation for things like prioritization, etc.
It's more of a way for the same companies to avoid having to physically install wires / fiber to a residence. It just means people will end up with a single 5G carrier instead of a single fixed broadband provider. With no additional competition.
They’ll just quit maintaining the wireline infrastructure until it’s so bad no one wants to use it. Where I’m from they cut out the copper infrastructure when they install FTTH. They need to share infrastructure where it exists, but I’m guessing there’s nothing that bans the “owner” from tearing out the shared infrastructure.
Bingo. Already happened on the street behind my house. AT&T has copper where most of the pairs have been damaged over the decades and they just shrug about it.
One would think, but that does not seem to be the reality thus far. Once one of those markets can be fully substituted with the other, then we can see if they truly compete, or if it somehow forms yet another abusive oligopoly
Meaning that they're both roughly equal in terms of latency and bandwidth, so much that one could swap one out with the other and not know the difference
Because I do not see them as strongly competing technologies. Cable has sheer speed and latency on its side and 5G has mobility. Switching my cable connection today for a 5G connection tomorrow would make my life worse. They both fill niches the other can't.
I think it's all a sliding scale, not a binary either/or.
Of course, the better they substitute, the more competition there is. But in the grand scheme of things, even eg email and telephone and driving and sending a letter compete with each other to a certain extent.
So eg for watching a movie, latency ain't that important.
I live in Singapore, and my options right now are 4G and fibre.
In practice, 4G is regularly faster and lower latency for me around the flat, but that's just because I usually cripple my fibre connection by using it via wifi. And our 4G is usually more than competitive with wifi.
This sounds like an issue with your WiFi. Properly configured, a modern 802.11ac (or better, 802.11ax) WiFi router should be much faster than the fastest 4G or even 5G speeds.
We have up to 1Gbps 4G speeds in the city centre. But I just got 68 Mbps on the iPhone indoors (via fast.com).
The wifi router is just the cheap one that the ISP gave us. When I went next to it just now, it reached 330Mbps (also on fast.com via the iPhone's wifi).
My main complaint is the latency of the wifi when we are in a different part of the flat, ie through a wall. I suspect there might be some packet loss, and then TCP adds a few round-trips to resend packets and the congestion control reduces the amount of data sent, or something like that.
Yes, I get pretty similar indoor 4G speeds on my iPhone X here in the UK. And I also saw exactly 330 Mbps via WiFi to my 5G router. My MacBook is a little faster, 430 Mbps.
I have heard of these claims of 1Gbps+ massive MIMO LTE in Singapore, but is it really achievable in the real world or just a "theoretical" speed? There are impressive speed claims for 5G here in the UK, but 500 Mbps or so seems to be the real-world peak for most current devices.
When I connect my laptop via copper cable to our fibre modem, I get just shy of the Gigabit.
Only a few handsets can achieve Gigabit LTE. I don't think any iPhone is amongst them. (It is after all predominantly an American device. And what good would that capability do you in the US?)
The Gigabit LTE is useful in practice not so much for any sustained Gigabit speeds, but because it allows your phones to 'race to idle'. Most traffic is bursty, and higher peak speeds mean your phone transmits for a smaller proportion of time. Leaving more time for other phones on the network, and perhaps helping with battery conservation?
or cartels, or price fixing, or vertical monopolies, or industry consolidation, or effective monopolies ... when you let a thousand flowers bloom, unless you're careful usually all you get are weeds.
I'm not sure what you mean by only thought experiments. There's lots of history of cartels falling apart, monopolies being broken; or very often also: monopolies being upheld by the government, and sometimes supposed anti-trust law being used to keep out the competition.
Lots of the laws mentioned in the Wikipedia article you linked to are pretty anticompetitive. Eg
> Under Diocletian, in 301 AD an Edict on maximum prices established a death penalty for anyone violating a tariff system, for example by buying up, concealing or contriving the scarcity of everyday goods.
Or just next:
> It provided for property confiscation and banishment for any trade combinations or joint action of monopolies private or granted by the Emperor. Zeno rescinded all previously granted exclusive rights.[4] Justinian I also introduced legislation not long after to pay officials to manage state monopolies.
Or George Selgin's Good Money (https://www.goodreads.com/book/show/3392302-good-money) about an episode in the English Industrial Revolution when the private sector minted coins because the government monopoly was inept.
> "data caps and data throttling means they don't actually directly compete"
That depends on where you're located. In the UK, unlimited data plans are common on 4G/5G, and mobile carriers compete directly for home broadband customers. For example:
In my case, since there's no fibre installed into my building, 5G wireless with unlimited data (no caps, no throttling) works out both slightly cheaper and much faster than any available wired option.
In the US, the FCC rules that govern telecommunications treat wired differently than wireless.
Congress doesn’t allow states to regulate wireless services in the manner that other utilities are regulated. Regulatory teeth really only exists at the state level.
In most of the world, most of the time retail prices for bread are allowed to vary, and a bakery can offer a loaf of bread for 50 dollars or for 10 cents without issue.
You are right that there's lots of regulations and subsidies and taxes in the whole production chain of bread.
So if you want a more sophisticated version of my toy argument, then let's argue based on sliding scales, not binary distinctions:
There's lots of different products in different industries in different countries, and larger amounts of regulation don't seem to correlate with lower prices or better quality. If anything, it's just the opposite.
See especially black markets, which are by definition at most lightly regulated and spring up every time the regulations on the formal markets go awry. (I don't want to say 'every time regulations become too much' because it's not about quantity so much as it is about sensible rules.)
When using mmWave, inexpensive repeaters can be used to extend the reach of base stations around corners and into buildings efficiently. On the flip side, the reduction in cell size facilitated by lower interference allows greater reuse of spectrum.
Marketing but worthwhile background reading material:
Stadia is probably the main one for me. It'd work at 50 Mbps, but 4G speeds can vary (not consistently 50 Mbps at peak times), so it wouldn't always be smooth and lag-free at 1080p.
But in general, it's just nice to have faster downloads. Nice not to have to wait as long for those multi-GB OS updates, game updates, app updates, etc. Don't you agree?
Except for T-Mobile, which is using both 600 MHz and 2.5 GHz spectrum for 5G. The 2.5 GHz spectrum acquired when they bought Sprint has just the right balance of bandwidth (300 Mbps avg) and ability to penetrate walls and travel large distances. They do have to spend a lot of capex to increase backhaul to the cell sites to make it happen though.
Supposedly it will be better able to support large amounts of users in the same location (like sports stadiums), so that's a slight advantage if you go to football games or concerts a lot I guess?
More mystery as to future weather conditions! (not sarcastic, I dislike the idea that we should know the future of weather, though I grant that this is potentially a downside for managing responses to life threatening weather events like hurricanes)
> I dislike the idea that we should know the future of weather
Whaaaaaat? Is this some aesthetic preference, or a "humans weren't meant to share this knowledge of the divine", or are you trying to sell more weather resistant clothing? And if you wish to practice weather agnosticism, why is your preference so strong that you view the overall human lack of weather-knowledge positively? What do you think of meteorologists? I have so many questions...
There isn’t a legal means for that to happen as long as it’s possible to profitably market to potential new customers, pricing innovation falling under that category.
T-Mobile has been doing mmWave on 2 different bands for over a year now in various cities, airports, venues[1][2]. The difference between T-Mobile and Verizon was the launch strategy. T-Mobile played it right by not hyping technologies most of the country can't use, like LAA (5 GHz) and mmWave. Verizon arguably had the best LTE launch strategy years ago, but yeah... I guess they could afford to botch this.
Come to think of it, pre-merger Sprint probably had the best launch strategy which was to launch 5G on mid-band where they had enough spectrum to service both LTE and 5G NR users. Masayoshi Son was too busy flushing his money down other toilets like WeWork and Uber to cut the checks Sprint needed to get it completed nationwide. Sprint also suffered from having 5 or 6 different equipment vendors across the country... where T-Mobile had two. They have been ripping out the NSN 5G equipment from NYC post-merger, because this is a T-Mobile Ericsson market and they don't want to deal with it.
AT&T meanwhile got a $100 billion, 25-year contract from the post-9/11 telecom pork project, FirstNet[2]. So they've been throwing up tons of new sites with that money over the past year or two. I don't think anyone was thinking about 5G or spectrum planning over there, just focusing on acquiring consumers through acquisitions (DirecTV, some crappy Mexican carriers) and getting government contracts. Gov M2M stuff moves slower than Apple at acquiring new modem technologies for their devices, so there's not a huge rush to ramp up 5G.
Walking around a big city leaves a lot to be desired with 4G. There are constant dead zones, bad hand-offs, poorly configured repeaters, and endless cement/concrete. I am very much looking forward to 5G fixing this.
I’m not worried about in-building coverage. Most buildings have WiFi. I’m talking about walking block to block. If many more blocks have antennas, then my problem is solved.
Yeah, my 4G is fairly consistent and reliable whereas I just don't bother with WiFi if I can possibly avoid it. It's a hugely frustrating technology I think.
I'm constantly amazed at how fast my 4G/LTE service is.
I had to run my house off of my tethered phone the other day, and browsing seemed faster than cable -- obviously slower overall bandwidth, but good enough and very low latency.
I have no desire for my phone to be faster in either processing power or bandwidth -- it would give me absolutely no benefit.
I wish 5G, 8K video (and even 4k in many cases), etc., were not thrust upon consumers. Things are already good enough to be close to magic.
4K video is awesome. Probably wasted on a phone, but I can instantly see the difference between 1080p and 4K on anything larger.
I'm pretty sure that it's right on the diminishing returns margin, however. I don't have an 8K resolution device to try it out on, but that's because everything I use is in the 'retina' zone of pixel density, I never see a genuine pixel under normal circumstances. (Actually my TV isn't 4K but if I cared at all it would be).
For what I use my phone for, the only thing which could improve in a way I would notice is the camera. That won't be true for everyone, games in particular have an essentially unlimited appetite for specs, but this is just to say that the state of the art as of four years ago is still just fine by me.
The biggest difference with 4K IMO isn't the resolution itself, but the features that typically come with 4K content. HDR and 10-bit colour especially make for a noticeably better picture than the pixels ever will.
Back-of-the-envelope math I did using known ideal viewing distances suggests most TVs are too far away to get all the benefit out of even 4K. Something between 4K and 1080p would be the "ideal" balance between bandwidth and resolution.
For monitors, the maximum resolution is a bit behind 4K (since monitors are closer than TVs), and I suspect that’s why Apple offers a 5K iMac.
IIRC there were 4k upscaling demos, but I'm past the point of pixel resolution; my interests are not in pristine visuals but a blend of surprises in content and composition.
> But the technology to add missing detail will never exist; we must be content with a realistic facsimile.
You are right in a sense. But there are some details that are in the pixels but that your eyes can't see. You can upscale to make them visible to the eyes.
I normally browse the comments on certain channels while watching, and with the video taking up only the upper third of the screen, I can’t tell a real difference between 480 and 1080.
If I flip to landscape though it can be noticeably, significantly worse.
I bet we haven't even come close to realizing the potential of having extremely fast/high bandwidth cellular connections. Streaming 8k video to a phone seems silly, but there are some brilliant and clever people out there who will find a use for that kind of mobile bandwidth and apply it in ways you or I never thought of.
Bill Gates infamously said sometime in the 80's "640K is more memory than anyone will ever need" , and for the scope of computer applications at the time he was correct.
But (stipulating the attribution is correct) was Bill right?? Okay 640K is a little too far, but have we gotten 25,000 times more value out of the ~25,000 times more memory we now have? One could argue that we could have made much more ingenious use of, let's say, 1/100 of the memory and CPU we now have.
We are so far at the end of the value-add curve, that we're running super-computers so we can clear up our complexion or make ourselves look like mice or potatoes in real-time video -- it's absurd.
> One could argue that we could have made much more ingenious use of, let's say, 1/100 of the memory and CPU we now have.
The counter argument is that this is over-, premature, or unnecessary optimization. If we can put 2GB of memory in a computer for $20, why spend millions on R&D that will be rewritten to make it work on 128 MB?
There is also an environmental argument for your point, but everything will always be a tradeoff to companies trying to build and sell devices quickly.
Why would we expect value to scale linearly with memory and transistor count? I would in fact expect it to be logarithmic: each doubling would lead to a roughly linear increase in cool stuff we can do.
I think that's what we see. Yes, our computers could be radically better experiences if it weren't for the bad habit of squandering improvements with layers of unnecessary abstraction and laziness. But the only thing standing in our way is those unnecessary abstractions and that laziness, so maybe we should stop doing that, now that real gains in speed are slowing down.
I think in one of his AMAs on Reddit he confirmed that he never said it, and that no software developer in those days would have claimed 640K was more than enough.
Yep, when implementing a simple OS for a class, it is clear that the 640KB limitation was imposed by the hardware at the time, not the DOS. So it wouldn't even make sense for him to say it, unless it was more akin to "it's enough for us".
I don't know who said it, but it was in one book I had back in the day. That is a book from the era where 640k was something big companies might splurge on but only for systems that proved a need.
Yeah, VR comes to mind. But also, speed gets reduced even with a small deterioration in signal. If you're getting 10-20 Mbps at full strength, you might get 1-3 Mbps with a weak signal. I would presume that 5G would be faster in weak-signal scenarios.
Yeah, with 1 bar of signal strength though? I work remotely as a nomad and I frequently have to depend on 4G for connectivity. I don't get anywhere near 200 Mbps in most parts of the U.S. I'm lucky to get 20, and often limited to the low single digits.
In 2011-2012, when 4G started becoming mainstream, did you make similar claims? For example, did you say something in the lines of “why would I want to watch videos on my phone? 3G is more than enough for the browsing I am doing now?”
4G might be good enough for you right now, but your needs will increase, way more and way faster than you expect. Remember this the next time you try to do a video call / watch an HD video / etc. If we had stayed in 3G, you would not be able to do any of these, but when 4G was introduced you did not feel any need for them.
Idk, I still don't know why anyone would want to watch videos on their phone, and I have a 4G phone capable of showing videos... (watching videos on other devices is way better).
I mean, since the invention of semi trucks, you could technically send me ~40 tons of postcards in one shipment, which would be impossible in the days when postcards were sent via a guy with a satchel on horseback. But then why aren't you sending me 40 tons of postcards right now? Why would you even want to?
Don't get me wrong, semi trucks certainly have their uses and have enabled many things, and so would 5G, provided it does something better than 4G. But I can see why some people would go "Why would you want more of that?" Not everyone has the same wants and needs.
These days, latency seems more problematic to me than bandwidth. Unfortunately though, short of optimizing networks, and reducing the distance between server and client, there's not much that can be done for latency...
No, I did not say that - 3G was slow enough to be bothersome for simple tasks. Not everything is relative. There are absolutes, and there are diminishing returns on investment, and a differences between creating tools that solve problems and creating infrastructure in search of problems. But I suppose I should be happy for anything that drives growth, or even appears that it may, at this point.
> but your needs will increase, way more and way faster than you expect.
Forever? Is there no bandwidth so great that most users would be happy with it? Can new uses for extra bandwidth always be found, and do at least some of those uses always significantly improve the user experience?
> Remember this the next time you try to do a video call / watch an HD video / etc. If we had stayed in 3G, you would not be able to do any of these
I wouldn't? News to me! I just disabled 4G on my phone, which fell back to HSPA+ with a fast.com result of about 7 Mbps. I then successfully tested a Zoom test meeting with video (worked fine) and watched a YouTube video at 1080p (buffered for three seconds at the start, noticeably longer than usual, but then played fine).
> Forever? Is there no bandwidth so great that most users would be happy with it? Can new uses for extra bandwidth always be found, and do at least some of those uses always significantly improve the user experience?
The fact is that every year mobile data consumption is increasing significantly and it is expected to continue this trend. I do not have an opinion on whether more data equals better user experience.
> I just disabled 4G on my phone, which fell back to HSPA+ with a fast.com result of about 7mbps. I then successfully tested a Zoom test meeting with video (worked fine) and watched a YouTube video at 1080p (buffered for three seconds at the start, noticeably longer than usual, but then played fine).
Without knowing the context of your set up, I can only guess. I would say that not that many people use 3G in your area, so you used an uncongested cell.
Processing power on phones is useful with video editing and AR. Social media is moving towards videos instead of photos. Lot of it can be attributed to the capabilities of current phones.
We'll hit the wall on the technology eventually, just as we have for desktop computing. A 10-year old desktop is still perfectly serviceable. Imagine saying that in 2000.
Is 5G like 4G where it’s more spectrally efficient? So even if 5G is slower because it’s allocated fewer MHz, in aggregate, the network’s bandwidth is increased?
While having the luxury of comparing 20 Mbps vs 100 Mbps on mobile is nice, my bigger question is how many GB you can consume as a user before getting throttled.
Canada has speedy mobile networks... and $5-$10/GB pricing. No wonder it's fast.
Just ran this test from my iPhone 7^. 60GB/mo 4G plan, 15€/mo. 115 Mbps down, 30 up, barely takes a dent inside buildings. 100GB plans are available.
I’m looking at 5G developments with bewildered amusement :popcorn: (especially the beyond ridiculous marketing side) and honestly couldn’t care less right now.
^ mentioning the iPhone 7 because I had an 8 that stupidly went MIA, and the 7’s connection is definitely “lousier” than the 8’s was — the 8 was able to maintain a solid connection (obviously not at the speeds above, but continuously usable) on a 350kph TGV.
In Italy I pay 5€/month for 4G 40GB/month (WindTre operator). I tested this morning in Treviso, near Venice, (I was working remotely), download speed was about 40Mbit/sec. Vodafone is much faster but a bit more expensive.
In France I signed up for an exclusive offer with the Free Mobile operator where 100GB/month (in France) for 1 year only cost 1 EUR per month, after which it would go back to 19.99 EUR per month. After 11 months, the company panicked that they would lose a ton of customers, so they decided to give a 10 EUR a month discount... indefinitely. The kicker is that their roaming deal in the US is amazing: 25GB per month at HSPA+ speed. So now I pretty much have an $11/month bill for my data plan in the US. I use Google Voice for people to contact me. Best deal ever.
Apart from the food, the thing I miss most from France. You can fibre-hop indefinitely, because at any point in time someone's offering 100mbit or more at 10eur/mo. Currently 30 is the cheapest I can get fibre for, at a measly 50mbit.
Just HSPA+ speed (I think). In any case they wouldn't just remove the service, they'd transfer to another protocol. For the life of me I can't understand why anyone would need more speed. It's not like I need to download a 3GB game in X seconds in a spot where I don't have a wifi. This whole LTE and 5G is marketing bullshit. No one needs that much speed. IMHO.
In the US with ATT I currently pay $90/month (15.2x more!) for "5Ge" (fake 5G, which is actually 4G). At home, with not quite full bars, I have 87Mbps down, 3.9Mbps up (higher than the article says is average for ATT 5G). The plan is "unlimited" data, but you lose priority if you use more than 22.5GB/month. I've gone above that limit a few times but was never throttled (so that's good!). I guess another silver lining is that the plan works in similar unlimited fashion in Mexico/Canada.
A big problem is the marketing around 5G is a total clusterfuck, to the great detriment of consumers.
As I understand it, the "true" original definition of 5G was millimeter wave technology. From the PCMag article this is based on:
> Millimeter-wave uses very weak, short-range panels that are easily blocked by obstacles. In our tests, millimeter-wave doesn't generally penetrate buildings, and even has trouble with glass; we had our drivers keep their windows down, with the phones facing out, so the network even had a chance.
That is, this kind of 5G is essentially useless now, and will be for a long time until buildings/cars/subways etc. are built with repeaters that would make indoor coverage possible.
However, I definitely don't understand all the various versions of "5G" so it's difficult for me to understand which tech is used by which phones and carriers.
Sub-6 is about beam-forming — instead of sending your signal in all directions, it sends only in the direction of your device (and vice versa, which is why antenna design is a larger cost on mobile than before). Now the base station can use the same bandwidth for another device located in a different direction. Over time this will get better and have compounding effects.
And that’s just one item in the basket. mmWave is another. There are several.
LTE already has a fair bit of beam-forming in the standard; 5G dramatically increases the number of antennas used to do so (allowing you to beam-form more efficiently and precisely).
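A toy model of the beam-forming described above, using a uniform linear array. The element count, spacing, and angles below are made-up illustrative values, not any real base station's configuration — just a sketch of why steering phase weights concentrates power toward one user:

```python
import cmath
import math

def array_factor(n_elem, spacing_wl, steer_deg, obs_deg):
    """Normalized gain of a uniform linear array whose per-element phase
    weights are chosen to steer the main beam toward steer_deg."""
    total = 0j
    for k in range(n_elem):
        # Residual phase at the observation angle after applying each
        # element's steering weight (spacing is in wavelengths).
        phase = 2 * math.pi * spacing_wl * k * (
            math.sin(math.radians(obs_deg)) - math.sin(math.radians(steer_deg)))
        total += cmath.exp(1j * phase)
    return abs(total) / n_elem  # 1.0 exactly on the steered direction

# Hypothetical 64-element array at half-wavelength spacing, beam steered to 30 deg
print(round(array_factor(64, 0.5, 30, 30), 3))  # -> 1.0 toward the user
print(round(array_factor(64, 0.5, 30, 0), 3))   # -> 0.0 off-beam at broadside
```

The off-beam direction gets almost nothing, which is exactly what lets the base station reuse the same bandwidth for a device in another direction.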
Another bucket is coding schemes: LTE uses Turbo and Convolutional Coding, whereas 5G adds Polar and LDPC codes to the mix which bring us closer to the Shannon Limit (theoretical maximum data capacity given a certain SNR).
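Since all of those codes chase the same ceiling, here is a quick sketch of the Shannon limit mentioned above. The bandwidth and SNR numbers are illustrative only, not measured carrier figures:

```python
import math

def shannon_capacity_mbps(bandwidth_mhz, snr_db):
    """Shannon limit C = B * log2(1 + SNR): the hard ceiling that any
    coding scheme (Turbo, Convolutional, Polar, LDPC, ...) can only approach."""
    snr_linear = 10 ** (snr_db / 10)                  # dB -> linear power ratio
    return bandwidth_mhz * math.log2(1 + snr_linear)  # MHz * bits/s/Hz = Mbps

# Illustrative numbers: a 20 MHz carrier vs a 100 MHz mid-band carrier at 20 dB SNR
print(round(shannon_capacity_mbps(20, 20), 1))   # -> 133.2
print(round(shannon_capacity_mbps(100, 20), 1))  # -> 665.8
```

Note the capacity scales linearly with bandwidth but only logarithmically with SNR, which is why carriers chase wider spectrum rather than stronger signals.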
On the tower side they've been using multiple directional antennas per base station since the 2G days. Beamforming is only going to help on the mobile side.
5G marketing is concentrating on new high-band, because first 5G installations are deployed into new high-band.
5G spectrum also covers low-band and mid-band spectrum, across the frequencies used from 1G through 4G LTE.
5G installations in the countryside will use low-band that is more efficient than 4G/LTE. You can have _fewer_ base stations than LTE in the low-band, not more. 5G NR is a more efficient radio interface in all bands.
5G has several parts. If used on existing frequencies, I believe it's more efficient, as you suggested. That will be a clear win once there's enough deployed equipment (base stations and mobile devices).
It also is available on new frequencies. Lots of bandwidth on those frequencies, but a lot less signal propagation too. Helpful for sports stadiums, transit terminals, and other crowded situations, not so helpful for most situations.
Carrier aggregation sounds like it could be useful too.
I had an amazing experience with 5G home broadband in the UK...while it lasted. I paid £29/month and always got the promised 300+mbps (sometimes up to 500-600) down with ~10ms ping. It was 100% rock solid and seemed to be truly unlimited, e.g. in the first month I downloaded >500GB and never noticed any throttling whatsoever.
Then they took down the tower nearest to my flat for "improvements" a couple months into the pandemic and from then on it basically didn't work at all (it was supposed to switch to 4G and it theoretically did but the best I got with that was an extremely flaky 10mbps). They originally said it'd be fixed in a week but it was completely unusable for at least a month. So now I'm back to a cable connection.
I still miss the 5G though -- there was something pretty amazing about just plugging it into the mains and suddenly having super-fast wifi with 0 installation. I personally remain optimistic about its future, unlike pretty much everyone else in this thread I guess.
I remember this being the case at the dawn of each new cellular technology: EDGE vs 3G, HSPA(+) vs 4G, etc. It'll get better over time. That said, I am purposely avoiding 5G until that time. No sense in paying extra for the phone and service for features I -don't- want to use.
4G was amazing before everyone switched and it got overloaded. I remember being astonished to get better download speeds on my phone than my wired home internet when it first came out.
Yeah, power usage is going to be a hard one. One of the biggest battery drains, more so even than screen time, is a super weak cell signal. I am curious how they are going to mitigate this effect, especially with mid-band and mmWave 5G. I don't want my phone grabbing onto weak 5G and burning up when I'd be perfectly happy with existing low-band LTE.
I'm looking for someone to explain why 5G is important for the average user. In my case, I can stream video with no problem, web pages load quickly, new apps download fast.
The primary use case I can think of is to replace traditional cable broadband with it, but that would be impractically expensive given the cost of mobile data.
Way back in the day 3G was supposed to solve every issue with mobile data. Then we had to have 4G because it was faster. This is just another step up. Just like when you got high-speed internet. At some point, why does someone need gigabit speeds when a much slower speed will let you do 95% of what any normal user would need?
I think wireless data speeds are the new arms race, tbh. The interesting thing is that Verizon is not upselling you for 5G and AT&T is. Plus, AT&T was sued for claiming they had 5G in areas they really didn't, and now this article claims their speeds in certain cities are faster than they actually are.
The real question is, how many people will switch because of this issue and over selling it on their part?
Early versions of 5G, known as “non-standalone”, are just glorified 4G. With the exception of the radio link, everything else is 4G. This means the value proposition is that more people per base station (or per unit of area) will be able to stream video, download fast, etc.
When standalone 5G becomes available, the latency of the network will become comparable to the latency of wired networks, especially if it is combined with mobile edge computing. Then you will have access to new services that are impossible now, such as game streaming and VR.
I agree though that 5G is way more important for the public sector and the industry than the average consumer.
I can do game streaming now without an issue @ 1080p. Even latency isn't bad @ 25ms to 30ms ping time. Probably not what hardcore twitch gamers want. 5G isn't performing much better yet; Ookla shows ping times about the same, but it is supposed to come down to about 10ms (1ms in theoretically optimal circumstances, read: never for actual users).
Streaming VR would be a valid use though, the resolutions there are much higher.
The problem is that data caps will severely limit this. I can play an hour of streaming 1080p gaming before eating through almost 20% of my data cap. I usually tone it down to 720p. Data caps haven't actually improved that much over the years either: now I pay $20/month with Mint for 8GB. The supposedly "unlimited" plans from Verizon start at $70 and have an effective cap of 15GB. Compare this to around 5 years ago, when things were around 5GB, if memory serves (and Mint didn't exist).
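As a sanity check on those numbers, data consumed is just bitrate times time. The 3.5 Mbps figure below is the rate implied by "an hour eats ~20% of an 8 GB cap", not a measured streaming bitrate:

```python
def gb_per_hour(bitrate_mbps):
    """One hour of streaming at a given bitrate.
    Mbps * 3600 s = megabits; /8 -> megabytes; /1000 -> gigabytes."""
    return bitrate_mbps * 3600 / 8 / 1000

# ~3.5 Mbps sustained works out to ~1.6 GB/hour, about 20% of an 8 GB cap
print(round(gb_per_hour(3.5), 1))  # -> 1.6
```

At the ~16 Mbps sometimes quoted for 1080p60 video, the same math gives 7.2 GB/hour, which would blow through an 8 GB cap in just over an hour.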
The backbone network capacity that mobile networks tap into isn't going to get any cheaper for providers, even if 5G signals bouncing around their own network have higher bandwidth, so I don't know how much 5G will actually reduce mobile providers' costs per unit of bandwidth if it significantly alters the peering ratio of backbone traffic interchange.
Given this, I'm skeptical that new high-bandwidth applications will get much traction, at least not before 3-5 years of additional product maturity, and not before the capital costs to providers for the upgraded hardware have been covered and they can gradually lower prices toward the marginal cost of production.
From the IoT side, it will be possible to have sensors everywhere in the city. So, you can have services like smart parking, monitoring of pollution at the city block level, etc.
With the introduction of slicing, it will be possible to have specific SIM cards as “first class citizens”, with complete priority over everything in the network, while also being secure. These SIMs can become “first class citizens” dynamically. So for example devices that belong to firefighters can become so, during an emergency. This enables ordinary devices to meet the high SLAs needed in such cases. Thus, now firefighters can use the already existing, way more powerful (for example it can do live video) and cheap public network, instead of their current communications solutions.
Oh, you meant IoT for infrastructure etc. Yes, that makes a lot of sense. Thanks for explaining!
Your example with the firefighters is a good one, no matter whether they are public sector employees like in the US or privatised like in Denmark. That's why I was confused.
> but that would be impractically expensive given the cost of mobile data
That needn't be the case with 5G -- there's a lot more bandwidth to go around, particularly with mmWave, so home broadband (with unlimited or near-unlimited service) is more feasible. Verizon is piloting a home 5G service in several cities for this reason, and it doesn't have a data cap.
The article says they only gave 5G 5MHz of spectrum. Of course this makes it slower. In 4G LTE it's common to use multiple blocks of spectrum (carrier aggregation) to provide even higher data throughput.
Yea, 5G does support carrier aggregation with LTE. It looks like the big difference is in LAA (License Assisted Access). LAA allows the use of unlicensed spectrum which can be added to the licensed spectrum that carriers use. 5G NR Release 15 (which is what networks are almost assuredly running since Release 16 just came out in July) doesn't have the ability to use unlicensed spectrum. One of the images in the PCMag article (linked below) shows the different combinations. You can see how LTE with LAA just has a lot of spectrum dedicated to it. 5G ends up with almost the same amount of spectrum as LTE for non-LAA LTE.
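A back-of-the-envelope sketch of that carrier-aggregation difference. The per-carrier bandwidths and spectral efficiencies below are illustrative assumptions, not AT&T's actual configuration:

```python
def aggregate_mbps(carriers):
    """Throughput from carrier aggregation: the sum over component
    carriers of bandwidth (MHz) times spectral efficiency (bits/s/Hz)."""
    return sum(bw_mhz * bps_per_hz for bw_mhz, bps_per_hz in carriers)

# Hypothetical LTE+LAA combo: two licensed 20 MHz carriers plus 40 MHz unlicensed
lte_with_laa = [(20, 3.0), (20, 3.0), (40, 2.5)]
# A single 5 MHz NR carrier, like the article describes
nr_single = [(5, 3.5)]

print(aggregate_mbps(lte_with_laa))  # -> 220.0
print(aggregate_mbps(nr_single))     # -> 17.5
```

Even with a better per-Hz efficiency on the NR carrier, raw spectrum dominates — which is why LTE with LAA can test faster than a spectrum-starved "5G" connection.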
And it's important to note that right now, 5G isn't that much more efficient than LTE. Remember, when LTE came out, it was advertised as 5-8Mbps which we'd consider slow today. 5G NR will get better as it evolves. However, right now it looks like it'll be around 20% more efficient in low-band deployments and 50% more efficient in mid-band deployments (T-Mobile had numbers similar to that in some FCC filings during their merger proceedings, IIRC). Things like higher-order MIMO will help mid-band and higher applications more.
Given the chart of relative 5G/4G download speeds and the fact that T-Mobile usually beats their 4G network, it seems likely that AT&T is leaning heavily on LAA for their LTE performance and that's the difference-maker. As Release-16 becomes what networks and phones are using, that will likely change.
Verizon often has 1,000MHz+ of millimeter-wave spectrum and there's virtually no one using it. T-Mobile is getting 300Mbps speeds using only 40-60MHz of spectrum. Even LTE, when no one is using it, can easily hit a few hundred Mbps (and sometimes more) with traditional spectrum. As carriers are able to use double or triple the spectrum for 5G, we're going to get faster speeds.
I think you're right that some of the halt in fiber rollout is likely due to the impact 5G will have. Even if mmWave spectrum can't go very far, it probably offers a good last-few-hundred-feet option. Instead of having to run fiber into someone's home, they can run fiber to their street and let a wireless link handle the last bit. They don't have to deal with construction permits, digging, coordinating with homeowners, etc.
However, I don't know what the costs of 5G equipment are right now and there might be resistance from neighborhoods that don't want a 5G cell-site on top of a bunch of lamp-posts in their area. I don't find them that intrusive, but many people have strong aesthetic opinions on their neighborhoods. It'll be interesting to see how things turn out.
>there might be resistance from neighborhoods that don't want a 5G cell-site on top of a bunch of lamp-posts in their area.
IME, most of the resistance is from people who think that 5G is going to give them cancer. Pseudo-scientific anti-5G posts abound on local facebook groups.
This is the difference between Fiber-to-the-Node and Fiber-to-the-Premises. FttN is already something that ISP's do without it being too newsworthy, but FttP is a much more expensive thing in general, because this last little bit of infrastructure is much more expensive per-customer than just running it to the Node and using existing cable or phone lines to reach your customer.
Google/Verizon's "fiber roll-out" was FttP, and what GP was talking about, and it's very understandable to abandon that approach to just FttN plus 5G.
High-bandwidth wireless data, even using the latest 5G technology, still literally requires ample bandwidth: wide enough radio frequency channel(s) over which to communicate. So far, two major US cellular providers (AT&T and T-Mobile) have allocated so little spectrum space to 5G, that 4G connections are still faster in almost every city tested. For example, on AT&T’s 4G network, phones can use up to seven radio bands simultaneously to increase speeds, but phones on 5G are still limited to a single 5MHz band.
Sounds like this is exactly what empirical testing of the AT&T and T-Mobile networks would be expected to find. Verizon’s 5G network isn’t showing this issue because they’re using only millimeter-wave spectrum and equipment, with the trade-off being scant availability due to its tiny usable cell area.
I just drove through Iowa and had 5G almost all the way. A year ago I drove the same route and it was 2G the whole way. So T-Mobile has clearly sped things up in rural areas for me. At home I still get 4G, but that is good enough.
Give me rock solid and cheaper 4G before jumping the gun to 5G.
How will going from say 40mbps to 250mbps+ on a phone help me in any way over the next 2 years? For video, the limits of a phone display cap the max bitrate below 5G's speed. 1080p at 60fps is around 16mbps.
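The display-cap argument is simple division. Assuming the ~16 Mbps 1080p60 figure above (a rough rule of thumb, not a measured value):

```python
def streams_supported(link_mbps, per_stream_mbps=16):
    """How many ~16 Mbps 1080p60 streams fit in a given link."""
    return link_mbps // per_stream_mbps

print(streams_supported(40))   # -> 2: today's 40 Mbps already covers one viewer
print(streams_supported(250))  # -> 15: the headroom mostly benefits shared cells
```

One phone can't use the extra capacity for video, but fifteen phones sharing a cell can — which is the carrier-capacity point made elsewhere in this thread.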
If you are in a cell with low congestion, 4G speed is probably enough. Since mobile data consumption is increasing rapidly, though, cells will start to become congested. 5G ensures that more people will be able to stream video at the same time.
It depends on how often you change phones. If you keep your phones for more than 3 years, I would wait. If not, by the time you really need 5G you'll be about to buy your next phone, which will certainly have 5G.
More seriously, as more 5G is deployed it should speed up. Also I believe 5G allows more phones to use a single tower at once which should make the carriers job easier. Especially at very busy places like festivals and sporting events.
I live in a country that won't see 5G soon, if ever (that is, not in the US). My point is that the existing LTE networks are plenty fast for all intents and purposes. I don't always notice the difference even between 3G and 4G.
So, then, it comes naturally to ask what is the exact problem that 4G has but 5G aims to solve, besides all the marketing and change for its own sake. Especially that weird kind of 5G that can't go through your hand and has coverage area of a single tower smaller than that of a WiFi access point.
Actual development and standardization use shorter cycles than the 5G hype cycle.
Release 8 is slow; your phone would show 4G.
Release 14 is super fast; your phone would still show 4G.
Release 15 will show 5G, but it's just one iteration beyond fast 4G.
It brings some cool techniques that boost efficiency (more people in the same cell can download things, increasing the average speed), among other stuff. Peak download rate is one of the less interesting features, but the most prominent.
My understanding is there are two forms of 5G. One works like 4G but faster, and the other (millimeter wave) has the limitations you mentioned but crazy high bandwidth.
We’ll probably get a mix of the two deployed depending on topography, density, cost, etc.
Main source of latency is not the last mile in regards to mobile networks. So yes, the rest of the internet doesn't care, as those routes are not going to change just because you now have 5G.
A quick search leads me to believe that the majority of people use their smartphones as phones: they make calls, they send text messages, they may use apps to buy movie tickets, check their mail, play games. Most of these uses don't really demand a lot of bandwidth. Why should I watch a video on my smartphone's 5" screen when I could watch it on my 55" TV?
I think the cellular carriers should invest more in improving their coverage, especially in rural areas. Of course, that'll never happen because the carriers would rather focus their investments in the big cities, where people are willing to buy a $1000 smartphone every year.
One of my 'smartphones' has a screen the same size as most of the laptops I've ever owned, 13" rounded up.
Also, you're way out of touch with how 'the youth' (anyone under 30) use the Internet. It's highly video-centric, and sure, depending on the market, keeping that video feed alive away from WiFi can get expensive: but I would rather pay than go without, and I suspect this is often true for most youngsters as well.
Cell carriers spend money where they can make money, and that isn't rural bandwidth. If we want to change that, and I think it's worthwhile to do so, it will take legislation.
The appeal of smartphones is the ability to watch videos, play games, etc anywhere you are (within the coverage areas, of course).
> I think the cellular carriers should invest more in improving their coverage, especially in rural areas
I would say coverage is generally pretty good, especially in the Eastern half of the country. There's very little benefit to providing service in the middle of nowhere when the majority of their subscriber base is going to be in bigger cities.