Articles

Sample Articles from Bob Wallace.



IT managers stand to gain faster, symmetric services as fiber network spending climbs.

A steep spike in carrier spending on optical broadband access equipment combined with federal incentives to deliver high-speed service to rural areas is changing the broadband access game and options for IT managers.

This is increasingly being accomplished through the deployment of 10 Gigabit Passive Optical Networks (10G PON), point-to-multipoint access systems that use fiber optic media to flexibly provide advanced services.

Why? Telcos first provided T-1 and Ethernet services to IT managers at SMBs seeking more robust corporate networks, but cablecos won many of those firms back with cheaper, faster DOCSIS 3.0 and 3.1-based services. Now the spurned telcos have an advantage with the deployment of, and upgrades to, 10G PON networks.

Unlike current DOCSIS services, 10G PON-based services are symmetric (the same download and upload speeds), something cablecos cannot yet match. Symmetric speeds are essential for applications such as surveillance, security, and remote monitoring that stream video upstream.

Targeted businesses

“In addition to helping them better compete with cable in the residential broadband market, telcos’ 10G-PON fiber deployments are also aimed at reclaiming share in the small-medium business (SMB) market, where cable operators have also made substantial gains over the last decade,” explained Jeff Heynen, Vice President, Broadband Access and Home Networking at Dell’Oro Group, which conducts market research spanning telecom, networks, and data center IT industries. “XGS-PON allows them to offer symmetric, multi-gigabit services that cable operators will have a difficult time competing with until they expand fiber networks of their own.” Telcos are also expected to put the full-court press on small enterprises.

Review time for IT managers

This emerging shift in dynamics requires enterprise IT managers to explore the capabilities and potential benefits of 10G PON services. They should also ask carriers about their timelines for upgrading existing plant and deploying the new 10G PON infrastructure.

Much of the push for the extension of broadband to underserved and unserved expanses in America is powered by the FCC's Rural Digital Opportunity Fund (RDOF) program, which is providing carriers funding to reach these areas with high-speed services. Proposed federal and state broadband spending programs are further fueling the broadband breakout.

IT managers should expect an increase in 10G PON deployments in the U.S., even before RDOF projects get rolling. While these deployments are primarily for residential applications, many operators see 10G PON as their connection to small and midsize businesses (SMBs) and smaller enterprises.

Spending on 10G PON equipment skyrocketed 500% in the first quarter, according to a newly published report by Dell'Oro Group. Total global revenue for the broadband access equipment market increased to $3.3 billion in the first quarter of 2021, up 18 percent year-over-year.

Upgrading to 10G PON

  • Making the connection. IT managers looking to cash in on 10G PON services will need a standalone optical network terminal (ONT), a piece of termination equipment. Another increasingly popular option is to insert a pluggable module in an existing switch at the user site.
  • Service availability. IT managers will need to make certain the business services they have been using at a site are offered over the 10G PON infrastructure. The same goes for the service-level agreements (SLAs) they had with those business services. They should check that the agreements cover everything from the ONT on the user premises to the optical line terminal (OLT) in the telco network.
  • Protection. IT managers must ascertain how traffic will be protected and secured, since PON is a shared technology. IT managers are used to having strong SLAs. That should not change. But it does require some knowledge and investigation by the IT manager to make sure that is the case.

Starting from scratch – broadband options for rural areas

Even with the RDOF and other broadband-expansion initiatives underway, IT managers in areas lacking broadband access cannot be sure how they will get high-speed services. The following are the primary options. Time and money are the deciding factors, as carriers must balance the cost of providing broadband access with the time it takes to reach anxious customers.

  • Fixed wireless. A carrier favorite in cases where an antenna and/or tower is less expensive and quicker than trenching and laying fiber, especially in un- and under-served rural areas.
  • Satellite. With the re-emergence of low-earth orbit (LEO) satellites from Starlink (Elon Musk's SpaceX), Amazon, Telesat, and OneWeb, high-speed data can be beamed to locations equipped with a package of small on-premises gear.
  • Fiber. A known option that is the priciest of the three (and the most time-consuming in new installations), but telcos need the speeds it makes possible to get ahead of DOCSIS 3.0 cable service alternatives and to be ready when cablecos step speeds up with DOCSIS 4.0 services.

Fiber for the win?

Despite the competitive options, Heynen believes fiber has the inside track when it comes to rural areas that will be addressed as part of the FCC’s massive RDOF broadband breakthrough program.

"I do think the majority of RDOF money is going to be put towards the expansion of fiber projects because of the demand to provide gigabit speeds,'' Heynen began. "However, quite a number of major projects are going to be Fixed Wireless Access (FWA)-based, largely because they address areas where fiber deployments will simply be too costly."

Operators need to balance the speeds promised with the time and cost it will take to roll out full fiber networks. IT managers should also expect carriers that won funds from the RDOF to use a combination of both fiber and FWA.


As results, performance factors, and challenges come to light, interest among IT managers is spreading.

The Olympics in Japan will use facial recognition this summer. And Disney World announced in May it will test the technology for a month to cut wait times at the theme park as part of a more touchless experience.

But before getting serious about cloud-based or on-premises facial recognition systems, IT managers must first determine whether such systems will meet the performance levels the enterprise requires for its application.

Often lost in the heated debate over whether municipalities and law enforcement entities should be allowed to use facial recognition to identify suspects in crimes is a secondary question: How effective are the AI-driven, software-based systems?

And before we dig into early returns on performance, we must acknowledge that far more verticals could apply facial recognition, beyond enterprises with access control implementations, law enforcement organizations, and large multipurpose sports venues looking to identify more than just staff.

Vertical value

  • Retail. Identify shoplifters to reduce shrinkage.
  • Transportation. Airport security. Reduce waiting times. Check-in kiosks.
  • U.S. government. Deploy for border control uses.
  • Entertainment. Security for mass events at large venues such as the Olympics, etc.
  • Banking. Replace passwords with screen-based face scanning for online banking.
  • Healthcare. Deploy touchless access to secure lab gear, clean rooms, and more.
  • Media/marketing. Test audience reaction to movie trailers, characters in TV pilots, and optimal placement of TV promotions.
  • Consumer products. Measure visual emotions through facial analysis to pick candidates for new or enhanced offerings.
  • Travel. The US Department of Homeland Security predicts that facial recognition will be used on 97% of travelers by 2023.

Performance, historically

How far have facial recognition systems come? The Technology Policy Program at the Center for Strategic and International Studies (CSIS), which produces non-partisan and non-proprietary research, took a look.

Facial recognition has improved dramatically in only a few years, according to CSIS blog author William Crumpler in April 2020. “As of April 2020, the best face identification algorithm has an error rate of just 0.08% compared to 4.1% for the leading algorithm in 2014, according to tests by the National Institute of Standards and Technology (NIST). As of 2018, NIST found that more than 30 algorithms had achieved accuracies surpassing the best performance achieved in 2014.”

But beyond algorithms, how have the systems fared in real-life deployments over the last few years? The answer, of course, depends on how their results are used, not merely how they are measured.

Beyond the AI engine

A case in point is the New York Police Department (NYPD), which began a massive storage upgrade in 2009 to support the deployment of its facial recognition systems in parts of the Big Apple, such as the financial district. It is important to note, however, that the results the department released last year are used as one piece of a larger process by which it assesses the technology's effectiveness.

The NYPD has long stated that any “matches” are used as only one part of an investigation. They are just leads and NOT enough to issue arrest warrants. And before it becomes a useful lead for criminal investigators, a match from facial recognition is analyzed by a group that reviews each one and weighs in on the quality, and hence viability, of the match.

Further review

“If possible matches are identified, then trained Facial Identification Section investigators conduct a visual analysis to assess the reliability of a match and conduct a background check to compare available information about the possible match and relevant details of the investigation,” according to the NYPD FAQ.

Specifically, the NYPD uses facial recognition to compare images obtained during criminal investigations with lawfully possessed arrest photos. When used in combination with human analysis and additional investigation, facial recognition technology is a valuable tool in solving crimes and increasing public safety.

The NYPD measures the effectiveness of its facial recognition software by how well it has done in providing substantial leads in criminal cases. Last year, the NYPD released results for the 2019 calendar year. The organization explained the following: “(Its) Facial Identification Section received 9,850 requests for comparison and identified 2,510 possible matches, including possible matches in 68 murders, 66 rapes, 277 felony assaults, 386 robberies, and 525 grand larcenies.” “The NYPD knows of no case in New York City in which a person was falsely arrested on the basis of a facial recognition match,” it added.

What affects facial recognition performance

Those in the know cite three variables that can impact the accuracy of a facial recognition system’s ability to make matches. They are the person’s pose, illumination, and expression (PIE) in the photo.

The mask factor

And something that was perhaps not much of a concern before the global pandemic: the use of face masks. The long-term prospects for mask use after COVID-19 in the U.S. and elsewhere are unclear, but masks have left their mark.

Consider this research and testing from NIST. The agency tested 89 commercial facial recognition algorithms using digitally applied masks and found error rates of 5% to 50% when matching masked faces to photos of the same person.

Vendors are hard at work improving that error rate. The BBC reported that a US Department of Homeland Security "controlled-scenario test" in January found one facial recognition system with a 96% success rate, although the results of those tested "varied greatly."

Environment dependent

And while it is a safe bet that we will see less mask-wearing, save for the winter months, some facial recognition systems will still have to deal with what are known as facial occlusions. These include longtime staples such as hats, sunglasses, scarves, and heavy facial hair, items known to weaken the performance of facial recognition systems, especially in applications that rely on video surveillance.

IT managers with controlled access scenarios in mind for their systems may not need to be concerned with this variable in facial recognition accuracy.

The road ahead

Facial recognition systems are attracting broader interest in the form of tests and actual deployments by users including Disney World and the 2021 Olympics. But deployments that include municipal law enforcement entities face headwinds from privacy advocates and those with Big Brother concerns.


Enterprise, law enforcement, and sports venue use expands, raising the need for IT infrastructure evaluation to support the resource-demanding systems.

Whether it is access control for employees, clearing fans to fill a stadium for mass events, or helping identify suspected criminals, users are forging ahead with the evaluation, testing, and implementation of facial recognition systems.

And while law enforcement entities have encountered headwinds in the form of privacy and Big Brother concerns, interested IT managers across industries all need to consider infrastructure requirements for facial recognition systems.

Your biometric options

For many security-conscious IT managers, key cards are giving way to the biometric family of approaches, which use speech, fingerprints, retina scans, and even breath to provide or prevent access to facilities and systems.

Another biometric approach, facial recognition, found its way from science fiction and spy movie material at the turn of the century to actual use by law enforcement, the military, and sports venues.

Early uses

Largely unbeknownst to the masses, facial recognition was tested/used at Super Bowl 35 in Tampa in 2001, just months before the 9/11 terrorist attacks on New York City and the Pentagon.

Since then, the New York Police Department has pushed to implement facial recognition but has run head-on into opposition from privacy advocates, the American Civil Liberties Union, and those fearing Big Brother use of the advanced technology.

Seeing its value for limited access control applications, the NFL’s New Orleans Saints recently disclosed it implemented a facial recognition software system from AnyVision in December 2019. The team explained the system has been used to give players, staff, and other workers secure, seamless access to practice facilities. Not long after, COVID-19 hit the U.S.

How matchmaking works

Facial recognition is used to identify or confirm a person's identity using their face. It can be employed to identify individuals in pictures, videos, or in real-time.

The systems use algorithms to spot distinctive details of a person's face, such as the location of their eyes, the shape of their nose, or the appearance of their chin. By combining these attributes, they can read the geometry of a face and match it to an image. Most facial recognition systems use 2D images rather than 3D because it is easier to match a 2D image with public photos or those in a database.

The system then converts the facial image to data. How does that work? The best explanation of this step comes to us from security software vendor Norton: “The face capture process transforms analog information (a face) into a set of digital information (data) based on the person's facial features. Your face's analysis is essentially turned into a mathematical formula. The numerical code is called a faceprint. In the same way that thumbprints are unique, each person has their own faceprint.”

This faceprint is then run through a database of faces seeking a match in what can be a huge repository of many millions of images, with the largest found in law enforcement and crime-fighting organizations.

If your image is matched with one in a facial recognition database, the system notifies authorities who decide if it is a close enough match to act on.
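To make the faceprint-matching step concrete, here is a minimal Python sketch. It is an illustration only, not the NYPD's or any vendor's actual pipeline: the 128-number faceprints are randomly generated stand-ins for what a real face-analysis model would output, and the 0.6 similarity threshold is an assumption that real deployments tune. The sketch simply ranks gallery candidates by cosine similarity against a probe faceprint; in practice, any candidate match would then go to the kind of human review described above.

```python
# Illustrative sketch of faceprint matching (not any vendor's real pipeline).
# A "faceprint" here is just a vector of numbers standing in for the output
# of a real face-analysis model.
import numpy as np

EMBEDDING_SIZE = 128   # typical faceprint length; varies by system (assumption)
MATCH_THRESHOLD = 0.6  # similarity cutoff; real systems tune this per deployment


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two faceprints (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def rank_candidates(probe: np.ndarray, gallery: dict) -> list:
    """Compare one probe faceprint against an enrolled gallery, best match first."""
    scores = [(name, cosine_similarity(probe, vec)) for name, vec in gallery.items()]
    scores.sort(key=lambda item: item[1], reverse=True)
    return [(name, score) for name, score in scores if score >= MATCH_THRESHOLD]


if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    # Pretend gallery of enrolled faceprints (e.g., employee badge photos).
    gallery = {f"person_{i}": rng.normal(size=EMBEDDING_SIZE) for i in range(5)}
    # A probe that is a noisy copy of person_3, as a camera capture might be.
    probe = gallery["person_3"] + rng.normal(scale=0.1, size=EMBEDDING_SIZE)
    print(rank_candidates(probe, gallery))  # person_3 should rank first
```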

Infrastructure issues

The infrastructure required to support an effective facial recognition operation depends on how the technology is being used and the volume of images used to attempt matches. For example, if an enterprise is using facial recognition in an access control application to confirm the identity of employees, entities may be able to upgrade their current corporate IT infrastructure.

However, if a large, distributed enterprise is considering an implementation for all employees and workers, network and storage systems might need to be dedicated or augmented, along with computing power in the data center.

In either instance, high-speed network connections are a critical piece of the facial recognition system to ensure the low latency needed for high-volume scanning and matching in time-sensitive applications, such as access control.

Network needs

In a metropolitan law enforcement implementation, a network of surveillance cameras serves as the foundation for the facial recognition software, which pulls images from a large database backed by massive storage.

By contrast, a solution for an NFL venue could run on the arena's backbone network, with upgrades to storage in the facility's data center. In newer, multi-purpose NFL stadiums such as Atlanta's Mercedes-Benz Stadium, the IT infrastructure is anchored by a data center that processes all data on game day and backs it up to the vendor's cloud that same day.

Originally deployed by telecom carriers to deliver triple-play service bundles to the home, a Gigabit Passive Optical Network (GPON) here connects a wide array of high-speed devices and systems to the data center. The Mercedes-Benz GPON uses 4,000 miles of fiber-optic cable. GPONs can handle the low latency needs of high-volume scanning and matching.

When it hosted Super Bowl 53 in February 2019, the facility’s backbone network supported:

  • 15,000 Ethernet ports
  • Wi-Fi equipment
  • 700 POS devices
  • A security system
  • Keycard-accessed doors

Though the NFL had used facial recognition as early as 2001, individual teams have more recently been evaluating the technology. Some are operating small-image-volume applications to verify the identities of employees and players entering stadium facilities.

From staff access to fan security

IT infrastructure revamps could be in the plans should sports teams expand their facial recognition implementations beyond players, staff, and suppliers to include the gameday crush of 65,000 to 90,000 fans, all rushing to pass quickly through often-jammed stadium entry points.

The same goes for older stadium venues that have added infrastructure over the years to support Wi-Fi 6, 5G service, and advanced video replay systems for fans and TV broadcasters.


Tracing their roots to “Moneyball” 20 years ago, sports businesses believe their data-driven decision efforts can help enterprise IT.

When it comes to using tech to optimize core business processes, business-to-business (B2B) and business-to-consumer (B2C) entities are often more similar than dissimilar. And when it comes to analytics, sports leagues, teams, and federations think enterprise IT can learn from their experiences.

Think about it. Traditional enterprises and sports entities both exist in competitive industries. Both invest heavily in data collection from their assets (business systems and workers vs. players and stadiums). And each grapples with implementing new processes, expanding remote access, and addressing security challenges, to name but a few.

Both types of businesses are focusing on enabling data-driven decisions and analytics designed to help them answer strategic questions before the application of AI and Machine Learning.

Sports entities have been implementing analytics since the turn of the century, as illustrated in the movie Moneyball. What can enterprises learn from sports analytics?

Ask the expert

Who better to ask than Christina Chase, Co-founder and Managing Director of the Massachusetts Institute of Technology’s Sports Lab?

The lab works with pro-sports teams, leagues, federations, and global brands to accelerate new technologies and approaches in areas such as player identification and development; in-game strategy; athlete health and performance; next-gen fan engagement and OTT; smart stadiums/venues; high-performance equipment, sensing, and fabrication; and e-sports, explained Chase. “And they’re having the same challenges you might be having.”

The lab’s clients include FIFA, Adidas, Red Bull, Major League Baseball, winter sports equipment maker Shred, Google Cloud, The Milwaukee Brewers, and the United States Olympic Team.

(Christina Chase will be keynoting Data Center World August 16-19, 2021. Use code NWC to save $200 on registration.)

Tech first and foremost?

Maintaining that technology alone is not the solution to firms' need to analyze the fast-growing mountains of data collected from countless endpoints in business and in sports, Chase suggests organizations first identify the questions that must be answered before talking tech. Her advice:

  • Figure out what you want to answer, evaluate your organization’s capabilities/capacity, then look at solutions.
  • Give yourself adequate time to vet potential solutions, including multiple contained pilots before selection.
  • Understand how this new data will integrate into existing data streams and processes and how it will be used.

Chase emphasized that without answering these questions, users will find that more data does not mean better insights and that tech is not the answer but rather a tool. And better technology does not mean an enhanced outcome.

A case in point: FIFA’s VAR

Soccer's global governing body FIFA embraced the MIT principles in its effort to help the on-field referee eliminate clear errors in officiating matches. The data-driven decision was to create a Video Assistant Referee (VAR), an additional referee located off the field. The VAR reviews game calls, penalties, goals, and more. The VAR has access to video on a tablet device and has direct voice communications with the chief referee on the field.

FIFA included stakeholders, the referees, in the process early on, explaining that the VAR would cut errors in calling the match by adding video review. It was not designed to replace their game calling.

 

FIFA discussed and tested different methodologies with referees. The input from the VAR was not intended to be the final outcome for the on-field chief referee but rather to help eliminate clear errors in calling the match.

The next steps included determining how to collect and move video around the venue and how to include this with other data streams, in part to help referees render a review that could be included without adversely disrupting the live television broadcasts of FIFA matches – and coverage on screens for fans in the stands.

Although an earpiece was considered for communication, tapping the ear was not deemed an effective way to let players, coaches, and fans at home and in the stands know that a play was being reviewed. Instead, referees began drawing an imaginary box in the air, so all parties knew of the review.

The VAR, which was used at the 2018 FIFA World Cup (see www.FIFA.com/VAR), was a big change beyond eliminating clear mistakes. Instead of players running up to the referee who had blown the whistle to debate the call, they knew from the box signal that the call was being reviewed by the VAR.

What do sports operations collect?

Sports enterprises currently collect rivers of data from athletes on the field, from RFID chips in shoulder pads in the NFL to data from sensors in race cars and the suits of their drivers. These entities collect this data to understand and optimize the performance of their most precious assets, not unlike corporations.

Data is also collected from these athletes off the field and track in hopes of better understanding and optimizing the training, conditioning, and health status of these same human assets. This has led to the development of wearables that monitor and deliver specific health information to trainers and equipment makers.

What about remote workers?

The remote worker surge is turning into a tidal wave thanks to COVID-19, during which the lion’s share of businesses were required to have employees work from home. Remote access is nothing new to businesses, but the scale at which it was done is still unprecedented. This required a rethinking of the way employees work (and how education is delivered) for most businesses.

Offices went dark while Zoom, Microsoft Teams, and other collaborative tools filled the gap.

The global operation challenge

The goals achieved and solutions used by international sports organizations can serve as a model for global enterprises, providing the consistency required across countless far-flung locations and remote and mobile workers. The data collection standardization achieved with FIFA's VAR, for example, simplified rollouts across venues.

In sports, racing teams were already supporting remote workers in fixed locations as well as entire race operations that travel to international races. This included everyone needed to field and run a racing team. That includes the pit crew, technicians, support staff, the vehicles, the supplies, and, of course, the drivers.

Avoiding data loss

Securing endpoints is a seemingly never-ending battle. Supporting fixed remote and mobile assets is one tall task, but securing all of these diverse and far-flung network entry points is of paramount importance in the sports business and for traditional enterprises as well. Organizations in both groups must do so to protect data transmission and collection and to protect against loss of their prime asset, intellectual property (IP), in fiercely competitive industries.

Case in point: Going global with Williams F1 Racing

Facing these challenges, Williams F1 Racing chose to partner with DTEX Systems, Inc., which empowers the virtual workplace through its endpoint data loss prevention (DLP) systems. The vendor's offering has enabled the racing company to meet the demands of the operation before and during COVID times, explained Al Peasland, Head of Technical and Innovation Partnerships at Williams F1 Racing.

“We travel to races in over 20 countries a year in what is a fast-paced business,” explained Peasland. The company has roughly 700 employees at its U.K. headquarters, as well as remote fixed workers and remote mobile workers, in addition to its country-hopping road race crew. At a race, everyone from the pit crew to the driver and the group that builds the garages for the cars needs access to data, and since there’s no such thing as a 9-5 day, some are coming in from a hotel network after hours.

Once at the track, the team's needs become more complicated as the staff collect data from sensors throughout the race cars and from their drivers, according to Peasland. Further still, the sensors on drivers' suits that collect data up until the race are removed before races to reduce the weight of the car and increase its performance.

The racing company uses a system from DTEX for workforce analytics and security to identify risky activities and employees. Enterprises in other competitive industries, such as pharmaceuticals and manufacturing, also want to track data to the extent of ensuring employees don’t take IP information or secrets from one employer to another.

The road ahead

Before collecting data on the road to analytics, enterprise IT managers may want to check in with their brethren in the fast-evolving sports industry, given their similarities in goals.

 

 


The FCC's RDOF has begun to award $20 billion to service providers. Winners are committed to expanding broadband to under- and unserved areas nationwide.

An injection of billions in funding for the deployment of high-speed networks beyond the reach of today's infrastructure gives hope that rural broadband will not forever remain an oxymoron, or a bridge too far, for IT managers. The rural broadband issue has gotten renewed attention as IT managers have had to support a home-based workforce.

Launched last year, the Rural Digital Opportunity Fund (RDOF) is the FCC’s latest step to fund service provider deployment of broadband networks in rural America. The agency is directing $20.4 billion, over ten years, to finance up to gigabit speed services in unserved rural areas, connecting millions of American homes and businesses to digital infrastructure.

A variety of service providers, including cable companies, telcos, wireless operators, and satellite firms, contended for the funds from the first phase of the RDOF program, which is an extension of the long-running Connect America Fund (CAF) initiative.

Those outlining plans (see below) include Charter, Windstream, LTD Broadband, and SpaceX.

Performance Tiers

The FCC created four performance tiers for RDOF funding. These tiers were weighted, with faster speeds and lower latency receiving more consideration for funding. The four tiers are a minimum tier of 25/3 Mbps, a baseline of 50/5 Mbps, an above-baseline tier of 100/20 Mbps, and a gigabit tier of 1 Gbps/500 Mbps.

In the first phase of the reverse auction, $16 billion of the over $20 billion was assigned to address an estimated six million-plus homes and businesses in census blocks that are entirely unserved by voice and broadband with speeds of at least 25/3 Mbps.

Phase II, with a budget of over $4 billion, is focused on those areas that the FCC deemed partially served as well as any areas that were not won in the first phase. It will occur once the FCC receives crucial updated broadband availability data.

Technologies and pricing guidelines

The agency is largely leaving it up to carriers which fixed broadband technology they employ to meet the performance ratings mentioned above. In addition, RDOF winners must offer standalone voice service and offer voice and broadband services “at rates that are reasonably comparable to rates offered in urban areas.”

Timeline – a reality check

When will IT managers see the fruits of the FCC’s RDOF efforts?

“I see the RDOF as being impactful beginning in 2022,’’ explained Jeff Heynen, Vice-President of Broadband Access and Home Networking for the Dell’Oro Group, a market research and consultancy. “The operators who were successful bidders still have a way to go regarding site planning and overall preparation before they even get underway with these networks later this year and next year.”

The 2009 stimulus bill, CAF, and CAF II projects showed that these endeavors invariably take a long time to reach fruition, noted Heynen. “So, I am not expecting these to move any faster. I believe this is also consistent with what most of the equipment suppliers (such as ADTRAN, Calix, etc.) have said, as well.”

Plan now, benefit later

Enterprise IT managers in the targeted unserved and underserved rural areas can begin planning for rural broadband later this year. This will entail the following:

  • Engage service providers to learn where they plan to offer broadband connectivity.
  • Work with them to understand what type of service they are planning, where they might pass locations, and whether they will provide it with fiber, fixed wireless, or something else.
  • Determine what type of upgrades will be needed to support the new speeds and services that might be offered by ISPs.
  • Begin to focus now on what lead times are for the devices needed for the upgrades as the industry fights through the issues that are affecting supply chains during COVID-19 times.
  • Be certain that the timeframes for deployment consider the site permitting process in the designated communities.

Winners commit: LTD, Charter, and Windstream

LTD Broadband, a wireless internet service provider (WISP), landed $1.3 billion from the RDOF auction and plans to provide high-speed broadband to unserved areas of 15 states. The company has pledged to provide gigabit speeds to rural areas with its RDOF funds using fiber when necessary to achieve those gigabit speeds.

Cable giant Charter Communications recently announced that the $1.2 billion it landed from the FCC's RDOF program will help drive its planned $5 billion fiber buildout undertaking, which will span 24 states.

Charter says the program alone will drive a 15% increase in Charter's network mileage coverage while expanding its service to more than 1 million previously unserved businesses and homes across the 24-state region.

Charter claims it will be providing the faster speeds with no data caps, modem fees, or annual contracts in place. The company says it will also be able to pitch its mobile voice, VoIP voice, and TV services or bundles to more customers across the rural areas.

Windstream landed over $520 million in the RDOF, funds it will use to expand FTTH internet over ten years to nearly 200,000 locations across its 18-state footprint, according to the carrier. The company’s CEO said he hopes to have shovels in the ground in the second quarter.

Windstream will begin work in all 18 states at once as it uses internal resources and external partners to do the construction. The carrier claims the RDOF build will expand its footprint by 2.5%.


Enterprises to benefit from COVID-19 driven distribution advances, cost reduction efforts, and energy efficiency options.

Few IT managers will be entrusted with asset tracking and monitoring of deep-frozen, lifesaving COVID-19 vaccines distributed worldwide during a once-in-a-century global pandemic.

However, almost all IT managers will tell you that keeping tabs on their precious products and related assets in an efficient and affordable manner remains a top priority for their enterprises.

Looking forward, breakthroughs in printable label trackers, more energy-efficient IoT units, and the interconnection of national carrier networks are shortening the long road to the democratization of powerful asset tracking and monitoring.

IoT tracking considerations

One technology for IoT communications is called Narrowband IoT, or NB-IoT. It falls into the low-power wide-area network (LPWAN) category, linking devices that need small amounts of data, long battery life, and low bandwidth.

Another approach is called LTE-M, short for Long-Term Evolution, Category M1. This scheme enables battery-powered IoT devices to connect directly to a 4G network without a gateway.

The core components of IoT asset tracking and monitoring are multi-function modules, power sources, sensors, and flexible communications networks. A new raft of advances promises to power the two capabilities forward from medium use to the eventual goal of massive IoT.
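Because NB-IoT and LTE-M are built around small, infrequent transmissions, trackers typically squeeze each report into a few dozen bytes before handing it to the communications module. The Python sketch below illustrates the idea with a hypothetical 20-byte report; the field layout, scaling factors, and flag bit are assumptions made for this example, not any vendor's or standard's actual payload format.

```python
# Hypothetical compact tracker report for an LPWAN link (assumed layout,
# not a real vendor protocol). Small payloads suit NB-IoT and LTE-M.
import struct
import time

# Big-endian fields: device id (uint32), unix time (uint32), latitude and
# longitude (int32, degrees x 1e5), temperature (int16, tenths of a degree C),
# battery percent (uint8), flags (uint8) = 20 bytes total.
REPORT_FORMAT = ">IIiihBB"


def encode_report(device_id: int, lat: float, lon: float,
                  temp_c: float, battery_pct: int, moving: bool) -> bytes:
    """Pack one sensor reading into a 20-byte report."""
    return struct.pack(
        REPORT_FORMAT,
        device_id,
        int(time.time()),
        int(lat * 1e5),
        int(lon * 1e5),
        int(temp_c * 10),
        battery_pct,
        1 if moving else 0,
    )


def decode_report(payload: bytes) -> dict:
    """Unpack a report back into readable fields (what the cloud side would do)."""
    device_id, ts, lat, lon, temp, battery, flags = struct.unpack(REPORT_FORMAT, payload)
    return {
        "device_id": device_id,
        "timestamp": ts,
        "lat": lat / 1e5,
        "lon": lon / 1e5,
        "temp_c": temp / 10,
        "battery_pct": battery,
        "moving": bool(flags & 1),
    }


if __name__ == "__main__":
    msg = encode_report(42, 52.37403, 4.88969, -18.5, 87, True)
    print(len(msg), "bytes ->", decode_report(msg))
```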

Powering up(ward)

Though the cold chain is one of the oldest IoT use cases, it is a prime example of one that needs more powerful trackers, which currently use battery power. Assets like vaccines often need to be shipped extra-long distances, in some cases halfway around the world.

The answer is longer-life batteries that can handle heavier power consumption to better support the broader asset monitoring. Asset tracking and monitoring use cases that do not span the globe and may be regional or national can get by with less powerful and less expensive units.

Look ma, no batteries!

Though most of the asset tracking and monitoring operations feature various types of batteries, with the price of many dropping as the market evolves, there will soon be an alternative that promises to cut energy costs for outdoor assets.

Last November, SODAQ of the Netherlands invented a solar-powered cellular IoT asset tracker that does not need a battery. The unit is the first of its kind, according to Nordic Semiconductor, which supplies the NB-IoT/LTE-M chip for the device.

The device, dubbed Track Solar, is not one of those super-cool-but-never-to-be-seen gadgets we'd often see at the Consumer Electronics Show every January. And it is not slide-ware. That is because SODAQ has already deployed the innovative device for a cattle-tracking application that uses solely harvested sunlight for its energy source.

For those seeking the device specifications, the solar-powered unit weighs 100 grams and includes a light sensor, accelerometer, temperature sensor, and status LEDs, and is powered by a 0.5-Watt solar panel. Its positional accuracy is said to be 100-plus meters for cellular. It also works with Wi-Fi and GPS. “By eliminating the need for battery replacement, the tracker… also eliminates the single biggest cost factor in large-scale IoT tracking installations and so supports high volume, low-cost cellular IoT tracking applications,” according to the company. But it is not the only breakthrough expected to drive broader IoT deployments.

It is well suited to a range of logistics, offshore, site management, and general asset tracking applications, according to SODAQ.

Bayer targets massive IoT headaches

Aiming to simplify, smarten, and reduce the costs of tracking assets in widescale IoT deployments, pharmaceutical and life sciences company Bayer announced last October a printable NB-IoT-based tracking label, which goes for a couple of euros (as opposed to about 5 euros each) and monitors its products through the supply chain.

The global enterprise partnered with carrier Vodafone, which has been busily interconnecting wireless networks to meet the growing need for roaming in broadscale asset tracking, and with other key innovators.

Bayer’s smart tracking label includes cellular SIM (iSIM) functionality in the communications module, a battery, microprocessor, antenna, modem, and a few sensors. The company worked with Vodafone, Arm, chipmaker Altair Semiconductor and module manufacturer Murata.

The goal of the broad undertaking is to “constantly track anything, anywhere,” according to Altair.

Linking country networks

With the steady and growing demand for IoT networks and managed services for tracking and monitoring – be it for cattle or vaccine shipments – carriers have been connecting their networks to enable roaming beyond country borders.

This trend accelerated last year as Deutsche Telekom connected its national networks with those operated by Vodafone Business, Telia, Swisscom, and others. As of last year, the GSMA estimated that 18 European countries were offering NB-IoT roaming. AT&T has also connected networks with Vodafone.

Enterprise IT managers should expect greater coverage this year and beyond, as additional countries interconnect their national networks to offer corporations broader coverage for tracking and monitoring using the IoT approach.

Ask questions first, act later

With all advances, breakthroughs, and other promising developments comes the need for enterprise IT managers to avoid taking fundamentals, such as SLAs, for granted.

Enterprise IT managers would be well advised to ask network providers about "how they plan and design networks, the robustness of their network server, the administration tools they provide, and how they qualify and support sensor-enabled IoT devices at scale," cautions Bruce Chatterley, CEO of IoT network operator Senet. Beyond network infrastructure, they need to inquire about "the levels of customer and technical support offered."
