The multipurpose network is significantly more cost-efficient than specialized or dedicated network solutions, making it the most affordable solution to address society’s needs across the spectrum from human-to-human to human-to-thing and thing-to-thing communication. It supports everything from traditional voice calls to immersive human-to-human communication experiences. In terms of human-to-thing communication, it enables everything from digital payments to voice-controlled digital assistants, as well as real-time sensitive drone control and high-quality media streaming.
With regard to IoT communication, the ubiquitous connectivity provided by the multipurpose network enables the creation of a physical world that is fully automated and programmable. Examples of this include massive sensor monitoring, fully autonomous physical processes such as self-driving cars and manufacturing robots, as well as digitally-embedded processes such as autonomous decision making in tax returns. The ongoing evolution toward the future network continues to rely heavily on the five key technology trends that I outlined in last year’s trends article. Therefore, in this year’s technology trends article, I have chosen to build on last year’s conclusions and share my view of the future network platform in relation to those five trends, with one addition: distributed compute and storage.
If the cybersecurity trends of the past few years are any indication, cybersecurity cannot be put on the back burner, at least not without costing organizations a fortune, both financially and in reputation. We have seen big players fall or suffer immense reputational damage, and many small businesses shut down, as a result of cyberattacks. Cybersecurity trends in 2019 had a major impact on the industry, the business world, governments, and the general public, and caught everyone's attention for good and bad. In this article, we will take a look at the cybersecurity trends that are likely to shape the industry in 2020.

Data security will be a top priority. Data is the new oil, and data breaches will continue as long as data remains a valuable commodity. As organizations come to understand the negative impact of data breaches, and with data security and privacy regulations like the GDPR in force, mitigating breaches through heightened and proactive web application security measures will be a top priority.

Cloud security measures for end-user trust. As confidence in cloud computing increases and more business processes, infrastructure, and data move to the cloud, new challenges have emerged. Cloud-based security threats owing to misconfigured security measures have risen starkly over the past two years. In 2020, cloud service providers will include stricter security measures, intelligent managed WAFs, and security testing features as an integral part of their offerings for improved end-user trust. With the rise in cloud adoption, one key aspect to keep in mind is infrastructure security: it is provided by the cloud vendor, but the business remains responsible for the security of the workloads hosted in the cloud. 2020 will see such shared security models being strengthened.
AI will increasingly be monitoring and refining business processes. While the first robots in the workplace were mainly involved with automating manual tasks such as manufacturing and production lines, today's software-based robots will take on the repetitive but necessary work that we carry out on computers.

More and more personalization will take place in real time. This trend is driven by the success of internet giants like Amazon, Alibaba, and Google, and their ability to deliver personalized experiences and recommendations. AI allows providers of goods and services to quickly and accurately project a 360-degree view of customers in real time as they interact through online portals and mobile apps, quickly learning how their predictions can fit our wants and needs with ever-increasing accuracy.
More devices will run AI-powered technology. As the hardware and expertise needed to deploy AI become cheaper and more widely available, we will start to see it used in an increasing number of tools, gadgets, and devices. In 2019 we are already used to running apps that give us AI-powered predictions on our computers, phones, and watches. As the next decade approaches and the cost of hardware and software continues to fall, AI tools will increasingly be embedded into our vehicles, household appliances, and workplace tools.

Human and AI cooperation increases. More and more of us will get used to working alongside AI-powered tools and bots in our day-to-day working lives. Increasingly, tools will be built that let us make the most of our human skills, the ones AI can't quite manage yet, such as imagination, design, strategy, and communication, while augmenting them with super-fast analytics fed by vast datasets that are updated in real time.

AI increasingly at the "edge". Much of the AI we interact with in our day-to-day lives runs "in the cloud": when we search on Google or flick through recommendations on Netflix, the complex, data-driven algorithms run on high-powered processors inside remote data centers, with the devices in our hands or on our desktops simply acting as conduits for information to pass through.
The web community has experimented with VR before, with VRML, but WebVR takes a new approach, one more suited to the modern web, which has supported accelerated 3D since 2011 with the release of WebGL. Now the web can handle VR thanks to new web APIs that take advantage of VR hardware using WebGL. These APIs enable WebGL content to be displayed in 3D on a VR headset. They also provide headset and controller tracking information to give the user presence in the virtual world.
Augmented reality technology saw record growth in 2019. Commercial support for AR is positioned to be strong: the installed user base for AR-supporting mobile devices has reached 1.5 billion. Industry players in the augmented reality world expect 2020 to be a year marked by an uptick in the pace of industry growth. A bulk of the latest advances in the field of AR were showcased at a number of tech events, some of which, such as Augmented World Expo and the Consumer Electronics Show, were attended by our team. They inspired us to gather these 9 trends that will shape the future of augmented reality over the next couple of years, and may inspire you in your own innovations. Based on a report from Gartner, at least 100 million users were expected to utilize AR-enabled shopping technologies by 2020, one of the hottest retail trends of this year. The boom in mobile devices that employ AR means the sector is now occupied by robust and mature technologies. Developers, retailers, and customers are now comfortably using them as part of their daily experience. A BRP report indicated that 48% of consumers said they would be more likely to buy from a retailer that provided AR experiences. Unfortunately, only 15% of retailers currently put AR to use, and only a further 32% of retailers stated that they plan to deploy virtual or augmented reality applications over the next three years.

An earlier version of the WebVR standard was available for desktop Chrome, Firefox, and Samsung's virtual reality web browser, Samsung Internet for Gear VR. These days, the standard is well supported on phones and desktop computers for almost all major headsets. The WebVR standards are worked on in the open, and they represent a collaboration between Mozilla, Google, Samsung, Oculus, Microsoft and, recently, Apple.
5G Wireless Use Cases. The revolution, like all others, will be subsidized. The initial costs of these 5G infrastructure improvements may be tremendous, and consumers have already demonstrated their intolerance for rate hikes. So to recover those costs, telcos will need to offer new classes of service to new customer segments, for which 5G has made provisions. Customers have to believe 5G wireless is capable of accomplishing feats that were impossible for 4G.

If we're being honest (now is always a good time to start), it's incorrect to say that 5G is the fifth generation of global wireless technology. Depending upon whom you ask, and the context of the question, there are really either four or seven generations, and only three sets of global standards. There was never really an official "1G." There were several attempts at standards for digital wireless cellular transmission, none of which became global. The term "2G" is credited to Finnish engineers, who used it to characterize the technological leap forward that their GSM standard represented. However, much of the rest of the world used CDMA instead, which was also "2G." So there was never a single, uncontested 2G.

The first-ever 6G wireless cellular mobile communications symposium took place in March 2019, and its outcome can be framed in one big vision statement: ubiquitous wireless intelligence. The 6G system is expected to witness an unparalleled revolution that will significantly distinguish it from the existing generations and will drastically reshape the wireless evolution from "connected things" to "connected intelligence." Specifically, 6G will transcend the mobile Internet and will be required to support ubiquitous AI services from the core to the end devices of the network. Quintessentially, AI will be the driving force in designing and optimizing 6G architectures, protocols, and operations.
The current study aims to present the latest state-of-the-art developments in relation to the vision, challenges, and potential solutions as well as the research activities for 6G communications. Accordingly, it attempts to integrate many likely solutions. Given the size constraint, this study thoroughly examines thought-provoking research areas by detailing their specific sub-domains to achieve precise, concrete, and on-the-spot deductions. The major contributions of this study are summarized as follows.

Vision and Key Features for Future 6G Networks. Given the massive capabilities of 5G cellular mobile wireless communications networks and their likely evolution, is there any tangible rationale for 6G networks? If yes, then what are the missing pieces from LTE and 5G that 6G must integrate? Academics, industry, and research communities have set out research modalities on the formulation, definition, design, and identification of the important core-enabling technologies driving the initiative toward a "beyond 5G" or 6G system. This section covers a large range of topics discussed in recently published works about the vision and key features of 6G communications. It starts with a brief view of the expected applications that 6G communications will support, which in turn leads to identifying the key features required of such communications.
Wireless sensor networks are emerging as a useful configuration of sensor entities capable of providing solutions for diverse applications under various unexpected and difficult conditions. These networks are now permeating almost all areas where sensing of some physical quantity is the primary application. In addition to sensing applications, these networks are used to further assist in evaluating different situations in tandem with numerous higher-level applications. While the applications of sensor networks are diverse, there are many challenges that need to be mitigated in order to implement practical systems effectively. Due to the scarcity of resources and the often unattended nature of these networks, it is difficult to control them and perform the management operations needed to satisfy the requirements imposed by the various applications of the users. The operations of wireless sensor networks are seriously hampered by limitations pertaining to energy management, optimum bandwidth utilization, and connectivity. In order to cater to personal, commercial, and military applications, it is important to improve the sensor network lifetime. The distributed and volatile structure of the sensor nodes, which are sometimes deployed in eccentric environments where it is difficult to control these networks, may hinder access to the sensors or reduce their lifetime if energy is not efficiently managed. In monitoring applications, which are typically mission-critical, it is important to ensure efficient management, accuracy, and reliability for sensing applications and the associated hardware. Accuracy also plays an important role where wireless sensor networks are deployed for tracking applications. Efficient algorithms are implemented that process critical data gathered from the deployed sensors in order to achieve accurate tracking capability.
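The energy constraints above can be made concrete with a back-of-envelope lifetime estimate under duty cycling. The sketch below is purely illustrative: the battery capacity, current draws, and duty-cycle values are assumptions, not figures from the text.

```python
# Back-of-envelope estimate of sensor-node lifetime under duty cycling.
# All parameter values are illustrative assumptions.

def node_lifetime_days(battery_mah, active_ma, sleep_ma, duty_cycle):
    """Average current draw weighted by duty cycle, converted to days."""
    avg_ma = duty_cycle * active_ma + (1 - duty_cycle) * sleep_ma
    return battery_mah / avg_ma / 24  # mAh / mA = hours; / 24 = days

# A 1% duty cycle stretches the same battery from days to over a year.
always_on = node_lifetime_days(2400, active_ma=20, sleep_ma=0.005, duty_cycle=1.0)
low_duty  = node_lifetime_days(2400, active_ma=20, sleep_ma=0.005, duty_cycle=0.01)
print(round(always_on, 1), round(low_duty, 1))  # 5.0 vs roughly 488 days
```

The two orders of magnitude between the radio's active and sleep currents are what make duty cycling the dominant lever for network lifetime.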
Cognitive radios are emerging and evolving in response to the requirements of today's bandwidth-hungry users, who need reliability and higher throughput in an already congested and scarce spectrum. Although cognitive radios are still evolving, with issues and challenges of their own, their amalgamation with sensor networks is also promising, as it may accommodate numerous requirements of sensor network users.
We cast the place recognition step as a classification problem and propose an efficient search-space reduction that considers only navigable areas where the user can be localized. Classification hypotheses are then discarded by exploiting temporal consistency and a relative tracker that relies only on the sensor input data. The solution uses a compact classifier whose representation scales well with the map size. After being localized, the user is continuously tracked through the known environment using an efficient data structure that provides constant access time for nearest-neighbor searches and that can be streamed so that only the local region close to the last known position is kept in memory. Robust results are achieved by performing a geometrically stable selection of points, efficiently filtering outliers, and integrating the relative tracker based on previous observations.
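A uniform spatial grid is one simple way to obtain the constant-access-time, locally streamable behavior described above: each point hashes into a cell, and a query only inspects the cells around it. The sketch below is an illustrative stand-in, not the paper's implementation; the 2D setting, class name, and cell size are assumptions.

```python
# Minimal sketch of a uniform-grid index giving constant-time access to
# points near a query -- the kind of structure that lets a tracker keep
# only the local region in memory.
from collections import defaultdict
from math import floor, dist

class GridIndex:
    def __init__(self, cell=1.0):
        self.cell = cell
        self.buckets = defaultdict(list)  # cell key -> points in that cell

    def _key(self, p):
        return (floor(p[0] / self.cell), floor(p[1] / self.cell))

    def insert(self, p):
        self.buckets[self._key(p)].append(p)

    def nearest(self, q):
        """Scan only the 3x3 block of cells around the query."""
        kx, ky = self._key(q)
        candidates = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                candidates.extend(self.buckets[(kx + dx, ky + dy)])
        return min(candidates, key=lambda p: dist(p, q), default=None)

idx = GridIndex(cell=1.0)
for p in [(0.2, 0.3), (1.7, 0.4), (5.0, 5.0)]:
    idx.insert(p)
print(idx.nearest((1.5, 0.5)))  # (1.7, 0.4)
```

Because lookups touch a fixed number of cells regardless of map size, far-away buckets can be paged out and streamed back in as the tracked position moves.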
The rapid uptake of mobile devices and the rising popularity of mobile applications and services pose unprecedented demands on mobile and wireless networking infrastructure. Upcoming 5G systems are evolving to support exploding mobile traffic volumes, real-time extraction of fine-grained analytics, and agile management of network resources, so as to maximize user experience. Fulfilling these tasks is challenging, as mobile environments are increasingly complex, heterogeneous, and evolving. One potential solution is to resort to advanced machine learning techniques, in order to help manage the rise in data volumes and algorithm-driven applications. The recent success of deep learning underpins new and powerful tools that tackle problems in this space. In this paper we bridge the gap between deep learning and mobile and wireless networking research, by presenting a comprehensive survey of the crossovers between the two areas. We first briefly introduce essential background and state-of-the-art in deep learning techniques with potential applications to networking. We then discuss several techniques and platforms that facilitate the efficient deployment of deep learning onto mobile systems. Subsequently, we provide an encyclopedic review of mobile and wireless networking research based on deep learning, which we categorize by different domains. Drawing from our experience, we discuss how to tailor deep learning to mobile environments. We complete this survey by pinpointing current challenges and open future directions for research.
The Internet of Things (IoT) is an important research area, and substantial developments for a wide range of devices and IoT platforms are evident. However, one of the critical issues in IoT is that the different proprietary IoT platforms and systems are still not interoperable: they are unable to talk to each other. In this paper, we survey the state of the art on interoperability in IoT. First, we provide a classification of techniques and schemes looking at IoT interoperability from different perspectives. For each category, we present the approaches proposed in the papers. Second, we use the interoperability classification as a baseline to compare some of the existing IoT research projects and identify gaps in the existing solutions. The current IoT market is fragmented due to the extreme degree of heterogeneity in terms of device protocols, controllers, network connectivity methods, application protocols, standards, data formats, and so on. The absence of interoperability in IoT is due to a lack of standardisation. Vendors are intentionally defining different IoT platforms, proprietary protocols, and interfaces that are incompatible with other
solutions. Therefore, these vendors create different verticals and mostly closed ecosystems, which are sometimes called stovepipes or silos. To be precise, the components in one silo cannot talk to the components in another. For example, before customers can access different IoT things, they generally need a dedicated application for each particular thing preloaded onto their smartphone. In this way the customer ends up with many devices, each with its own application, that work independently of each other. There are also data interoperability issues when developers want to create an innovative IoT application exploiting resources from different IoT applications and/or services in heterogeneous domains. These issues ultimately lead to vendor lock-in of end users.
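The silo problem can be made concrete with a small adapter sketch: two hypothetical vendor payloads describe the same temperature reading in incompatible formats, and a per-vendor adapter maps each onto a common model that applications can consume. Every field name and format here is invented for illustration.

```python
# Illustrative adapters bridging two incompatible (hypothetical) vendor
# payload formats onto one common data model.

def from_vendor_a(msg):   # vendor A reports Celsius: {"temp_c": ..., "dev": ...}
    return {"device_id": msg["dev"], "temperature_c": msg["temp_c"]}

def from_vendor_b(msg):   # vendor B reports Fahrenheit: {"value": ..., "unit": "F", "id": ...}
    celsius = (msg["value"] - 32) * 5 / 9 if msg["unit"] == "F" else msg["value"]
    return {"device_id": msg["id"], "temperature_c": round(celsius, 1)}

readings = [from_vendor_a({"temp_c": 21.5, "dev": "a-17"}),
            from_vendor_b({"value": 70.7, "unit": "F", "id": "b-03"})]
print(readings)  # both readings now share one schema
```

The same idea, applied at the protocol and semantic levels rather than to a single payload, is what the interoperability approaches surveyed in the paper aim to generalize.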
Industrial IoT (IIoT) refers to the application of IoT technology in industrial settings, especially with respect to instrumentation and control of sensors and devices that engage cloud technologies. Until recently, industries mainly used machine-to-machine (M2M) communication to achieve wireless automation and control. But with the emergence of cloud and allied technologies (such as analytics and machine learning), industries can add a new automation layer and with it create new revenue streams and business models. IIoT is sometimes called the fourth wave of the industrial revolution, or Industry 4.0. Common uses for IIoT include predictive maintenance, remote monitoring of field assets, energy management, and asset tracking.
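As a rough illustration of the analytics layer described above, the sketch below runs hypothetical machine readings through a simple threshold rule of the kind a predictive-maintenance pipeline might start from. Every machine name, field, and threshold is an assumption for illustration only.

```python
# Toy analytics rule over (invented) IIoT sensor readings: flag machines
# whose vibration level exceeds an assumed acceptable limit.
import json

readings = [
    {"machine_id": "press-07", "vibration_mm_s": 4.2},
    {"machine_id": "press-08", "vibration_mm_s": 9.8},
]

ALERT_THRESHOLD = 7.0  # assumed limit, for illustration

alerts = [r["machine_id"] for r in readings
          if r["vibration_mm_s"] > ALERT_THRESHOLD]
print(json.dumps({"alerts": alerts}))  # {"alerts": ["press-08"]}
```

In a real deployment the rule would typically be replaced by a learned model running in the cloud, with the readings arriving over an M2M protocol such as MQTT.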
The Department of Space, in partnership with the Department of Telecommunication and the Department of Science and Technology, framed the Satellite Communication Policy in 1997 (SATCOM Policy). Through the SATCOM Policy, the government aimed to develop a strong satellite communication service industry in India, and thus the emphasis of the policy was on (a) developing the satellite communication, launch vehicle and ground equipment industries in India; (b) making the infrastructure built through the Indian National Satellite System (INSAT) available to a larger segment of the economy; (c) encouraging investment by the private sector in the space industry in India; and (d) attracting foreign investment in the satellite communication sector. The framework of the SATCOM Policy also laid the road map for authorising INSAT capacity to be leased to non-government parties, allowing Indian parties to provide services like uplinking of TV through Indian satellites, authorising the Indian administration to notify and register satellite systems and networks, and authorising the operation of foreign satellites from India.
As the SATCOM Policy did not specify the manner in which the policy can be implemented, the Department of Space, in the year 2000, formulated the norms, guidelines and procedures for implementing the framework of SATCOM Policy. The norms and guidelines issued by the Department of Space focused on the use and development of the INSAT network, preferential treatment to Indian satellites, allocation of capacity for use of Indian satellites by private market players etc.
Developments in the cloud computing industry move at a pace that can be maddening to follow and impossible to predict. But some big-picture trends that will characterize the market for the next year are coming into focus, even if the technologies that ultimately enable them and the vendors that drive them seem constantly in flux and vulnerable to disruption. Many of those emerging cloud computing trends stem from the industry entering a phase of standardization and increased compatibility, a sign of maturity in any tech sector. Cloud infrastructure (public, hosted private, and on-premises) is increasingly less siloed, allowing workloads to be more portable and data streams more mobile. That standardization, largely thanks to the open-source movement, is allowing a shift in focus up the stack, with new channel roles emerging to support application-level processes, from enabling artificial intelligence and high-performance computing to delivering novel SaaSOps and application development services.
Energy is the scarcest resource in ad hoc wireless networks, particularly in wireless sensor networks requiring a long lifetime. Intermittently switching the radio on and off is widely adopted as the most effective way to keep energy consumption low. This, however, prevents the very goal of communication, unless nodes switch their radios on at synchronized intervals—a rather nontrivial coordination task. In this article, we address the problem of synchronizing node radios to a single universal schedule in wireless mobile ad hoc networks that can potentially consist of thousands of nodes. More specifically, we are interested in operating the network with duty cycles that can be less than 1% of the total cycle time. We identify the fundamental issues that govern cluster merging and provide a detailed comparison of various policies using extensive simulations based on a variety of mobility patterns. We propose a specific scheme that allows a 4,000-node network to stay synchronized with a duty cycle of approximately 0.7%. Our work is based on an existing, experimental MAC protocol that we use for real-world applications and is validated in a real network of around 120 mobile nodes.
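A toy simulation shows why synchronization to a common schedule is essential: with a 1% duty cycle, two nodes whose wake windows are not aligned essentially never have their radios on at the same time. This is an illustrative sketch, not the MAC protocol from the article; the period, window length, and slot model are assumptions.

```python
# Toy model of duty-cycled radios: a node is awake for ON slots out of
# every PERIOD slots, starting at its own schedule offset.

PERIOD = 1000   # slots per cycle
ON = 10         # 1% duty cycle: radio on for 10 of 1000 slots

def awake(slot, offset):
    return (slot - offset) % PERIOD < ON

def shared_awake_slots(offset_a, offset_b, horizon=10 * PERIOD):
    """Count slots where both nodes can communicate."""
    return sum(awake(t, offset_a) and awake(t, offset_b)
               for t in range(horizon))

print(shared_awake_slots(0, 0))    # aligned schedules: full overlap
print(shared_awake_slots(0, 500))  # misaligned: no shared awake slots
```

This is exactly the coordination problem that cluster merging solves: independently formed clusters converge on a single universal offset so that every pair of nodes overlaps.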
Structural health monitoring (SHM) systems have shown great potential to sense the responses of a bridge system, diagnose current structural conditions, predict expected future performance, provide information for maintenance, and validate design hypotheses. Wireless sensor networks (WSNs), which have the benefits of reducing the implementation costs of SHM systems as well as improving data processing efficiency, have become an attractive alternative to traditional tethered sensor systems. This paper introduces recent technology developments in the field of bridge health monitoring using WSNs. As a special application of WSNs, the requirements and characteristics of WSNs used for bridge health monitoring are first briefly discussed. Then the state of the art in WSN-based bridge health monitoring systems is reviewed, including wireless sensors, network topology, data processing technology, power management, and time synchronization. Following that, the performance validations and applications of WSNs in bridge health monitoring through scale models and field deployments are presented. Finally, some existing problems and promising research efforts for promoting applications of WSN technology in bridge health monitoring throughout the world are explored.