The Conversation Has Ended and AT&T Has Finally Hung Up the Phone...For Good

By: Consuelo Azuaje

On January 1 this year, AT&T finally shut down its 2G network, and nary a peep nor clatter arose in complaint. We at James Brehm & Associates felt bound by professional and personal curiosity to ask one question: Why? We understand that if a tree falls in an empty forest, its fall is noiseless, for all intents and purposes. (Nobody heard it, after all.) But the community that had used AT&T’s 2G—the M2M/IoT crowd—heavily populates the Telco forest. So, how did AT&T pull it off, and what made AT&T shut down its 2G in the first place? Well, let’s take a look at the context.

The past twenty years have seen a substantial increase in mobile broadband data usage and a growing need for quicker networks with greater data capacity. (Last year, Cisco predicted that continued demand for data capacity would drive an 800% increase in mobile data traffic, and AT&T recently reported that data usage on its networks alone had grown by 250,000% since 2007.) To keep up with demand, AT&T and other carriers have periodically built newer, faster, more muscular networks that displaced older ones. With 5G around the corner, AT&T needed to free up some spectrum.

Wanting to give M2M/IoT clients plenty of time to prepare, AT&T announced back in 2012 that it would shut down its 2G networks by 2017. It made a huge push to retain 2G customers by having them migrate from 2G to Cat-1, 3G, or 4G networks, and even offered free 3G Samsung Evergreen phones to customers still using 2G handsets. Of its 4 million 2G customers, AT&T successfully migrated 2 million. The 2 million left behind were mostly M2M users, so their resistance to migration made sense: their devices, likely low-power units with long battery and shelf lives, would have been too costly to replace all at once.

Plus, Verizon and T-Mobile, still in the 2G game, were available to absorb many of the customers that hadn’t wanted to migrate to a different type of connectivity. Verizon has stated that it plans to keep its 2G network until 2019, and T-Mobile has announced that it intends to keep running its 2G network until 2020. In fact, T-Mobile was quick to take advantage of the situation, offering AT&T’s holdout 2G customers free SIM cards, service bundles tailored for IoT clientele, and the option of switching over to T-Mobile free of charge.

Now that AT&T’s 2G network has been shut down, can it be said that it even occupied a space in the telecommunications world? At one point, yes. And to a certain degree, it still does—just not as much. Like a dying campfire that burns long enough to keep its people warm but slowly cools after they have left, 2G did exactly what it needed to do for as long as it was needed—and now that we’re moving on, it’s just a few short steps from full retirement. We gathered around it for warmth, looked at the stars, and drew our own IoT constellations—swapping (what seemed like) wild connections between devices and envisioning our IoT future. Then the sun rose, other technologies emerged—and we moved on.

Fresh Out of CES 2017: Asus’s Newest Darling, the ZenBook 3 Deluxe vs. the MacBook Pro Bros.

By: Consuelo Azuaje

This year’s CES offered dozens of new laptops and desktops, and among them there was one that particularly caught our eye: the Asus ZenBook 3 Deluxe—a laptop as sleek as it is powerful.

The ZB3 Del is an ultra-lightweight like its predecessor, the ZenBook UX330UA, but it comes in just a hair lighter at 2.42 lbs versus the UX330UA’s 2.64 lbs. The ZB3 Del’s intended competition, however, isn’t its previous iteration, but the 2016 MacBook Pro. The MacBook Pro comes in two sizes, 13” and 15”. (The 15” model automatically comes with a nifty “Touch Bar” feature, but more on that later.) Whichever one you go with, however, the MacBook Pro is still going to be heavier: the 13” weighs 3.02 lbs, and the 15” 4.02 lbs. Both MacBook models are nearly as thin as the ZB3 Del at ~0.6” thick.

This time around, the Asus team expanded the ZenBook’s display from 13.3” to 14” by narrowing its already narrow bezel. With a 13.3” display, the 13” MacBook Pro doesn’t measure up to the ZB3 Del in this regard, but the 15” model sports a 15.4” display that exceeds the ZB3 Del’s.

Outfitted with a zippy Core i7-7500U processor, a 1 TB solid-state drive, 16 GB of RAM, and 3 USB-C ports (2 with Thunderbolt 3 support), the ZB3 Del pound-for-pound surpasses the previous generation and even has much better connectivity. Compared to the ZB3 Del, the 13” MacBook Pro has a less powerful processor (Core i7-5500U), fewer USB-C ports (just one), and half as much RAM (8 GB). The 15” MacBook Pro, however, touts a comparable—probably superior—processor, more USB-C ports (4, to be exact), an equal amount of RAM, and greater SSD capacity (2 TB). In other words, the ZB3 Del falls somewhere between the MacBook Pro 13” and 15” models in terms of performance. It’s a powerful machine, but not unmatchable.

For users who really care about resolution, the ZB3 Del falls behind both MacBook Pro models. The ZB3 Del features a 1920-by-1080-pixel display, whereas even the 13” MacBook Pro offers more at 2560 by 1600 pixels; the 15” offers 2880 by 1800 pixels.

The keyboard segment of the competition is a bit more subjective. Apple keyboards, starting with the 12” MacBook, use a “butterfly” mechanism rather than the traditional “scissor” mechanism. Although Apple claimed it makes for a more precise, less “wobbly” typing experience, it took a lot of users a lot of getting used to. In true Apple fashion, the MacBook Pro keyboard has shallow key travel (0.55 mm, apparently), which is largely a matter of taste and a hot topic of controversy among Apple users. There’s an unexpected but soon-missed sort of satisfaction to be had from punching your “Enter” key after finishing a long project, one that simply doesn’t exist on a keyboard with shallow travel. Put simply, it forces you to type more gently, and after years of typing one way, having to adjust can be annoying. To be fair, Apple has tweaked its shallow-travel butterfly keyboards since their first release and received more positive reviews.

On the other hand, the ZB3 Del features a traditional keyboard with 1.2 mm of key travel. Both machines have full-size, fully backlit keyboards, and both offer some sort of fingerprint-recognition feature. The ZB3 Del features a Touch Pad Handwriting app that allows users (while holding the Fn key) to use the touchpad as a literal notepad that picks up their handwriting and transcribes it to the screen, whereas the MacBook Pros offer a touch-sensitive strip above the keyboard, the “Touch Bar,” which functions as a navigation tool. And reviewers love the Touch Bar, raving about how widely customizable and functional it is.

When it comes to choosing between the ZB3 Del and one of the MacBook Pros, take price into account. The ZB3 Del costs $1,699; the 13” and 15” MacBook Pros cost $1,499 and $2,399, respectively. Ultimately, it comes down to personal preference.


The Connected Conversation: 2017 IoT State of the Industry

We recently teamed up with the good folks at IoT Evolution to reach out to IoT professionals across the board and get their perspective on the IoT-related trends and challenges they observed in 2016. In our latest issue of The Connected Conversation, we present the key insights we gleaned, how they define IoT’s current state of the industry, and what that could mean for the IoT community in 2017. We also discuss what the hottest IoT use cases and vertical market applications are and how different major IoT industry players fared this past year.

Click HERE to read the full issue.

For more information on how to subscribe to The Connected Conversation, please email


Imagining a 5G Future? Assume Hindsight as Inevitable

By: Consuelo Azuaje

Experts forecast minimal, less-than-one-millisecond latency with 5G networks. While the switch from 3G to 4G slashed latency in half, the switch from 4G to 5G is expected to do far more than decimate latency, dropping it by roughly 95%. Those aren’t just flashy numbers. This unprecedented speed will be put to use by the health and medical industry, connected homes and buildings, secure transport, asset tracking, and more. To a consumer in a self-driving car moving at 62 mph, it is the difference between the car traveling ~1 in (2.8 cm) in the time it takes to begin braking, versus the 4.6 ft (1.4 m) a 4G-connected self-driving car would travel before it could begin braking.
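The braking figures above fall out of simple arithmetic: reaction distance is just speed multiplied by network latency. A minimal sketch, assuming ~50 ms as a typical 4G round-trip latency (the 1 ms figure is the 5G target cited above):

```python
# Reaction distance a connected car travels before braking can begin:
# distance = speed x latency. The 50 ms 4G figure is an assumption
# consistent with the 1.4 m number quoted in the text.

MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

def reaction_distance_m(speed_mph: float, latency_s: float) -> float:
    """Distance in meters covered during the network-latency window."""
    return speed_mph * MPH_TO_MPS * latency_s

for label, latency_s in [("4G (~50 ms)", 0.050), ("5G (1 ms)", 0.001)]:
    d = reaction_distance_m(62, latency_s)
    print(f"{label}: {d:.3f} m ({d / 0.0254:.1f} in)")
```

At 62 mph the 1 ms case works out to about 2.8 cm and the 50 ms case to about 1.39 m, matching the figures in the paragraph above.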

But, oh, the possibilities: 5G in smart homes and driverless cars, of course, but also in tele-surgery, virtual reality, and an unforeseeable number of new venues. Enter a new Eden, the garden of 5G, which—as one journalist so aptly quipped—will “tickle” the senses. The Tactile Internet—development of which began as recently as 2012—is an emerging technology that will allow users to both transmit and receive tactile information (a.k.a. “haptic interaction”) while still receiving audiovisual feedback.

Realization of the Tactile Internet will demand enormous strides in the field of robotics, as well as (seemingly) impossibly quick data rates. But just as technology has made distance no object to visual and audio communication, the Tactile Internet will conquer distance to let users transmit and receive touch as well. This would grant “double presence” to surgeons whose delicate movements could save otherwise inaccessible patients.

Although the use of mobile phones for data collection and streaming (in addition to voice) has become the norm, the evolutionary destiny of their earliest ancestor—the telephone—was far from obvious when it was first being developed. And there's a good reason for that. Mark Twain once famously declared truth stranger than fiction. The second half of that quote is often forgotten, though. It reads: “but that is because Fiction is obliged to stick to possibilities; Truth isn’t.” Unconfined by our modern-day possibilities and limitations, tomorrow’s engineers and scientists will be able to imagine bigger and see further than today’s. The future will outstrip us, and that’s okay; it's only natural. Each generation can only see as far as the vantage point it inherited allows. There's some comfort to be found, though, in the fact that this generation's efforts will push the next to greater heights, and so on.

Take, for example, Sir William Preece, the distinguished and exacting Welsh engineer who studied under the legendary physicist and chemist Michael Faraday and who personally advanced the field of telephony. Looking back, we could say that Preece was better equipped than anyone and should have seen the enormous potential of telephones. But that wasn’t the case. Preece's contributions to telephony as an emerging field have been largely forgotten by the general public, and he is more commonly known for this dismissive soundbite: “The Americans have need of the telephone, but we do not. We have plenty of messenger boys.” So, moving toward the future and the future 5G network, let's think bigger than the proverbial smart fridge crying “buy more milk,” and throw our faith and support behind today's developers, in whose hands the future partially lies.

Connected Conversation Release: Exploring Low Power Options for IoT Solutions

In this issue, the James Brehm & Associates team looks comparatively at low power options, primarily Cat-0, Cat-M, and NB-IoT, to better explain what the device requirements are and how each option plays within LPWAN. On the connectivity front, we also talk about 5G's impact on IoT and its security risks, along with how the FCC plans to streamline 5G network rollouts.

Click here to read the full issue:

Redefining the Connected Conversation - IoT Data Analytics Survey Results

In our last issue of The Connected Conversation, we at James Brehm & Associates reviewed the results of the survey, IoT Data Analytics, which we conducted in partnership with IoT Evolution. 

If you'd like to have a more complete picture of how organizations interested in and/or currently using IoT data analytics solutions are faring, please read the compendium, Redefining the Connected Conversation - IoT Data Management & Analytics Survey, which also contains the survey results, but in greater detail. Click HERE to read our compendium of the survey results.

For more information on how to subscribe to The Connected Conversation, please email

The Connected Conversation: IoT Data Analytics Survey Results

Recently, James Brehm & Associates, in partnership with IoT Evolution, released a survey covering organizations that are interested in and/or currently implementing IoT Data Analytics and what is and isn't working for them. In this issue of The Connected Conversation, you will find out how IoT Data Analytics solutions are standing up to real-life application, and where there is room for improvement. Click here to read the results:

To receive more information concerning this survey, please contact

Industry Leaders Collaborate to Spruce up Legacy Networks with SDN & NFV Technologies

By Consuelo Azuaje

While some are focusing on the future nuts and bolts of 5G, others are looking past the required materials and focusing instead on organizational needs. Take the European Union's recently formed public-private partnership (PPP), the 5G Infrastructure Partnership, for example. Having received €700 million from the European Commission for research, the 5G Infrastructure Partnership's vision has five points: (1) greater wireless capacity; (2) energy savings of up to 90% per service; (3) communication networks wherein the majority of the energy consumed comes from the radio access network; (4) reduction of service creation time from ~90 hours to 90 minutes; (5) a network that can support connections from over 7 trillion wireless devices and serve more than 7 billion people. To these ends, the 5G Infrastructure Partnership has begun developing a new management model that involves software-defined networking (SDN) and network function virtualization (NFV).

Despite often being pitted against each other, SDN and NFV used jointly may just be the ticket to a 5G future. Professionals have described today's networking as rigid and proprietary, often finding themselves at odds with the constraints of legacy networks, but SDN allows them to build a flexible, programmable 5G architecture. The greatest difference between SDN and traditional networking lies in how each routes incoming data packets. In traditional networking, all incoming data packets are responded to and routed uniformly by network switches. Because the way a switch handles data packets is written into the device's firmware, network switches are not programmable, and the control and data planes exist on the same network device. SDN, however, separates the control plane from the data plane and allows the control-plane controller, known more commonly as the “SDN controller,” to program network switches dynamically in response to data packet flow. By doing so, it brings more flexibility to how networks are deployed and managed, and, most importantly, it allows many SDN components to be deployed on industry-standard x86 servers.
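The control/data-plane split described above can be sketched in a few lines. This is a toy illustration, not a real OpenFlow or SDN-controller implementation; the port names and forwarding policy are hypothetical. The switch holds a flow table but no routing logic of its own: on a table miss it asks a separate controller, which installs a rule dynamically (the "packet-in" pattern).

```python
# Toy sketch of SDN's control/data-plane separation (illustrative only).

class Controller:
    """Control plane: decides how flows are handled."""
    def rule_for(self, dst: str) -> str:
        # Hypothetical policy: choose an egress port by destination prefix.
        return "port-1" if dst.startswith("10.") else "port-2"

class Switch:
    """Data plane: forwards packets using rules pushed by the controller."""
    def __init__(self, controller: Controller):
        self.controller = controller
        self.flow_table: dict[str, str] = {}

    def forward(self, dst: str) -> str:
        if dst not in self.flow_table:
            # Table miss: consult the controller, cache the installed rule.
            self.flow_table[dst] = self.controller.rule_for(dst)
        return self.flow_table[dst]

sw = Switch(Controller())
print(sw.forward("10.0.0.5"))   # controller consulted, rule installed
print(sw.forward("10.0.0.5"))   # subsequent packets hit the cached rule
```

The point of the sketch is that changing network behavior means changing `Controller` (software), not reflashing every `Switch`, which is exactly the flexibility the paragraph above attributes to SDN.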

NFV, on the other hand, is poised to improve network function by focusing not on separating the control and data planes but on standardizing devices and replacing the physical with the virtual. Both, however, would spur replacement of the physical platform with an x86-driven one—swapping proprietary networking equipment for software running either on industry-standard server hardware or on virtual machines hosted on servers. By combining SDN and NFV, industry partners in the PPP reportedly expect five-fold returns on their investments, and they maintain that collaboration is key to the development of 5G. Competition, they claim, can come later.

NFL's Latest Field Addition: Zebra Technologies

By Jose Gallardo

With the NFL football season in full swing and millions of viewers tuning in to watch America’s most-watched television show, one thing is for sure: drama will be at the forefront of the season. Fans and teams alike have a hunger for real-time data that is not easily filled; insight into player performance is like food to a starving man. For the NFL, this need is now met by a company named Zebra as it connects NFL fans more closely with the players.

This technology is not something new; to be exact, the NFL has utilized it for two years but hadn't made the information public. Now that it has been refined and perfected, the NFL will release the information for all 32 NFL teams.

Zebra places two RFID sensors inside each NFL player’s uniform, one on the left shoulder pad and one on the right. The sensors are read by twenty radio receivers placed around the stadium, and the information is processed live to provide feedback on a player’s attributes, such as speed, distance traveled, acceleration, and deceleration. The sensors track players 15 times per second, giving analysts a precise reading on player statistics with a +/- six-inch margin of error.
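To see how 15 position fixes per second become the speed and acceleration stats mentioned above, here is a minimal sketch on hypothetical data. Zebra's actual processing pipeline is proprietary; this only illustrates the basic idea of differencing consecutive position samples.

```python
# Hypothetical illustration: estimating player speed from RFID position
# samples taken 15 times per second, as described in the article.

import math

SAMPLE_RATE_HZ = 15  # position fixes per second

def speeds_mps(positions: list[tuple[float, float]]) -> list[float]:
    """Speed (m/s) between consecutive (x, y) fixes, given in meters."""
    dt = 1.0 / SAMPLE_RATE_HZ
    return [math.dist(a, b) / dt for a, b in zip(positions, positions[1:])]

# One second of samples for a receiver running a straight route,
# covering 0.6 m per sample (about a 20 mph sprint):
track = [(i * 0.6, 0.0) for i in range(16)]
print(max(speeds_mps(track)))  # 9.0 m/s
```

Acceleration and deceleration fall out the same way, by differencing the speed series once more.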

In the past couple of years, throughout each NFL game, you’ve seen coaches and players with tablets on the field reviewing formations, schemes, and coverages. Moving forward, that may be only half of what they look for, as users will now receive information on how fast a receiver actually is and, even more important, how fast the defender covering him is. It will also give coaches a clear advantage in placing receivers in better matchups during games and practice. The health of a player can also be tracked: coaches can see how players’ energy drains from play to play and judge reaction times as the ball is snapped. Players who begin to slow down in the fourth quarter can be substituted, and players with pre-injury symptoms can be spotted before a more serious injury, such as a torn ACL, occurs.

Some players think that all of this information might hurt them in the long run. Coaches’ ability to precisely measure a player’s efficiency is definitely going to be a factor down the road in player contracts. But at the end of the day, the NFL will be the winner as it capitalizes on fans’ deeper attachment to the sport.


This latest issue of The Connected Conversation takes an in-depth look at how the agricultural industry is changing as farmers, ranchers, and others integrate IoT technology into their day-to-day operations. Also included is a review of recent significant IoT events, such as acquisitions, collaborations, and new technology development and testing (e.g., drone testing on 4G networks).
Click HERE to read the full issue.

For more information on how to subscribe to The Connected Conversation, please email