AT&T iPhone vs. Verizon iPhone Signal Strength Test

Uploaded on Feb 10, 2011

http://TheRecoveringBanker.com | Nic Coventry | The Recovering Banker

This is a test I did to prove that Verizon is much better than AT&T where I live and where I use my iPhone. Everyone has different experiences and uses their phone for different things. I personally use mine a lot at my home office and around the Metro Detroit area, but even then, I don't have full signal most of the time.

--

Thanks to my old buddy Dave Castro, I had the opportunity to spend a few hours with the latest iOS device to break sales records—the Verizon iPhone 4.

I was excited to try out this device. We found in previous experiments on the AT&T iPhone 4 that the device's power consumption varied greatly depending on signal strength—so much so that the extra energy spent on a weak signal eclipsed even the display as the No. 1 energy consumer while streaming the movie Star Trek! The hope is that with more consistent coverage here in the San Francisco Bay Area, and around Palo Alto in particular, the Verizon iPhone might finally deliver on the promise of a more dependable network and more consistent network performance.

Most early tests have shown generally better coverage on Verizon than AT&T, although the same tests have confirmed generally faster network performance on AT&T when you can get a signal. So here's the moment of truth: Will we be able to measure the difference in power efficiency of these two devices relative to signal strength? Does the Verizon iPhone 4 exhibit the same runaway energy usage in areas of low signal? Here we go...

First test -- I'm going directly for the jugular: stream Star Trek to the Verizon iPhone 4 in my office, where I can just barely get a signal on my AT&T iPhone 4. This is the case where I wasn't even able to get through the movie on a full battery charge. The AT&T iPhone 4's usage extrapolated to 5.3 watt-hours (Wh) of energy to play the roughly 2 hours and 6 minutes of Star Trek, with just about 1 bar of signal strength. On Verizon, I averaged 3+ bars of signal strength and streamed the movie with no problem, using 3.15 Wh of energy. It's looking like a landslide victory!
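
In case you're wondering how a run that never finished yields a full-movie number: the 5.3 Wh figure is an extrapolation from a partial playback. Here's a minimal sketch of that arithmetic; the partial readings aren't given here, so the example numbers below are just placeholders.

```python
# Rough sketch of the energy extrapolation described above.
# The partial readings from the run that drained the battery before the movie
# finished aren't given, so these figures are hypothetical placeholders.

MOVIE_MINUTES = 126  # Star Trek runtime, roughly 2 h 6 min

def extrapolate_energy(measured_wh, minutes_played, movie_minutes=MOVIE_MINUTES):
    """Scale a partial-playback energy reading up to the full movie length."""
    return measured_wh * movie_minutes / minutes_played

# e.g. if the phone drained its ~5.25 Wh battery by the 125-minute mark,
# the whole movie would extrapolate to about 5.3 Wh.
print(round(extrapolate_energy(5.25, 125), 1))  # -> 5.3
```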

Driving home, I monitored the signal strength on the Verizon iPhone. It varied between 2 and 5 bars, even in my well-travelled and well-known "dead spots" where I've dropped many calls and data connections. Another moral victory for Verizon. At this point, I'm about ready to chuck my iPhone back at the AT&T rep, and jump ship. But a little more testing is in order...

The next test is simply to rerun the Star Trek streaming experiment at my house. I chose to do it downstairs, where coverage is typically a little worse. And here's where it gets strange. Both phones showed about 2 bars of coverage. The Verizon iPhone varied between 2 and 3 bars, and the AT&T iPhone fluctuated more, between 1 and 3 bars. No big surprise yet, but the bottom was about to drop out.

Streaming Star Trek under these conditions, the first big observation was that the picture on the Verizon iPhone was clearly pixelated. It was having trouble getting the streamed data fast enough, particularly in the fast-moving action sequences. The AT&T iPhone showed some similar artifacts, but only occasionally. This continued, and got somewhat worse, throughout the movie. The Verizon iPhone actually stopped playing a dozen or so times during the movie, waiting for the data to catch up. The final results: the Verizon iPhone consumed 4.3 Wh of energy, and the AT&T iPhone consumed 3.15 Wh! No, that's not a typo. The winning number, 3.15 Wh, or 60% of the 5.25 Wh battery capacity, was exactly the same as where we started this round of the experiments, except on the other phone! That's some poetic justice.
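
To put the watt-hour numbers in perspective against the iPhone 4's roughly 5.25 Wh battery, here's a quick tally. The Wh figures are the ones from the tests above; the labels are just my shorthand for each run.

```python
# Quick check of the measured (and one extrapolated) Wh figures against the
# iPhone 4's ~5.25 Wh battery capacity. Values are taken from the write-up;
# the labels are informal summaries of each test condition.
BATTERY_WH = 5.25

results_wh = {
    "AT&T, ~1 bar (office)":     5.3,   # extrapolated; more than a full charge
    "Verizon, ~3 bars (office)": 3.15,
    "Verizon, ~2 bars (house)":  4.3,
    "AT&T, ~2 bars (house)":     3.15,
}

for label, wh in results_wh.items():
    print(f"{label}: {wh:.2f} Wh = {wh / BATTERY_WH:.0%} of battery")
# AT&T, ~1 bar (office): 5.30 Wh = 101% of battery
# Verizon, ~3 bars (office): 3.15 Wh = 60% of battery
# Verizon, ~2 bars (house): 4.30 Wh = 82% of battery
# AT&T, ~2 bars (house): 3.15 Wh = 60% of battery
```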

It's now past midnight, and I still have a few more tests to run. It seems our showdown wasn't nearly as conclusive as I expected. One possible explanation relates back to network coverage vs. network speed. Better coverage is a good thing, but there's an interesting inversion in equally low coverage areas—network speed might help make up for a poor signal in data-intensive applications by allowing the application to buffer more data until the signal quality improves. This, of course, is only a late-night random theory, but an interesting one to pursue. Things just get more and more curious. If you have any interesting ideas, send them along, and we'll continue the experiments together.
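
To make that buffering idea a bit more concrete, here's a toy simulation. Every throughput number in it is invented for illustration; nothing here is a measurement of either carrier's network.

```python
# Toy model of the late-night theory above: in a weak-signal area, a network
# that is fast whenever it does have signal can buffer ahead and coast through
# the dropouts, while a steadier but slower network falls behind the stream's
# bitrate and stalls repeatedly. All numbers are made up for illustration.

STREAM_MBPS = 1.5      # assumed video bitrate
MOVIE_SECONDS = 7560   # roughly 2 h 6 min

def stalled_seconds(throughput_mbps_at, start_buffer_s=10):
    """Simulate playback and count seconds spent stalled (buffer empty)."""
    buffered_s = start_buffer_s
    stalls = 0
    for t in range(MOVIE_SECONDS):
        buffered_s += throughput_mbps_at(t) / STREAM_MBPS  # video fetched this second
        if buffered_s >= 1.0:
            buffered_s -= 1.0   # play one second of video
        else:
            stalls += 1         # buffer ran dry: playback pauses
    return stalls

# "Fast but patchy": 3 Mbps for 30 s, then a 30 s dead spot, repeating.
fast_patchy = lambda t: 3.0 if (t // 30) % 2 == 0 else 0.0
# "Steady but slow": a constant 1.2 Mbps, just below the stream's bitrate.
steady_slow = lambda t: 1.2

print(stalled_seconds(fast_patchy))  # 0      -- buffers through the dead spots
print(stalled_seconds(steady_slow))  # ~1500  -- stalls again and again
```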
