Altoroslabs Technology Blog | Latest News in Custom Software Development

LF’s Project Alvarium: Ensuring Trusted IoT Data at the Edge with DLT (May 24, 2023)



Supported by Dell, IOTA, Intel, etc., this open-source project assigns trust scores to sensor data, using distributed ledger as immutable storage.

Background: the issue of trust

As the number and variety of IoT devices grow exponentially, so does the amount of data produced and shared between different systems and organizations. However, the information gathered may be inaccurate, incomplete, outdated, tampered with, or maliciously manipulated. For example, AI tools can automatically generate fake data so realistic that humans cannot tell the difference. So, how can we ensure that the information we use is valid and reliable?

Currently, the answer to this question involves a zero-trust approach, where all data is considered untrustworthy until proven otherwise. While this works for security, the approach is difficult to scale for IoT systems with thousands of devices.

Recognizing the problem, back in 2019, Steve Todd of Dell was actively exploring how trust insertion technologies could be applied to IoT scenarios, looking for ways to ensure data confidence at the edge. Eventually, the experiments within Dell led to the concept of a data confidence fabric (DCF)—an architecture that could help measure how trustworthy sensor data is as it moves from devices to apps.

In August 2019, the first DCF prototype was written in Go, relying on EdgeX Foundry, Dell Boomi, VMware Blockchain, Lightwave, and other components. In October 2019, Dell teamed up with the IOTA Foundation to contribute the DCF code base to the Linux Foundation, seeding Project Alvarium. A year later, a comprehensive paper was delivered, summarizing the vision behind the project.

However, the plans behind Alvarium were put on hold due to the COVID-19 pandemic, the official wiki says. Still, in October 2020, Dell and Intel delivered a DCF proof of concept intended for use in a privacy-preserving computer vision system for smart cities. By February 2021, Dell’s initial DCF was reengineered with nonproprietary DLT tools, such as IOTA’s Tangle and Streams, to align with the project’s open strategy.

A POC created by Dell and Intel for AI and IoT scenarios (image credit)

In October 2021, Alvarium found its home at LF Edge—an umbrella organization within the Linux Foundation, working on an open, interoperable framework for edge computing. The project is currently at Stage 1 of development, “At Large.” This means that it has been accepted by the Technical Advisory Council and has a clear alignment with the LF Edge mission statement.

After LF Edge welcomed the project, the team spent 2021–2022 focusing on a pilot, real-life DCF implementation for a biodigester in Molina, Chile. This plant processed organic feedstock to generate energy, with methane reduction data traveling from sensors over a DCF to the server running the Alvarium SDK. Since then, a few webinars and conference sessions have shed some light on the details of the implementation.

How Project Alvarium works

The goal of Project Alvarium is to create an open framework and an SDK for developing a data confidence fabric. In brief, the code helps to annotate IoT/sensor telemetry streams with trust metadata and confidence scores, using a distributed ledger for immutable storage. Users can then compare and prioritize different sources of data based on the trust scores, as well as automate audit and compliance.

In a webinar held in March 2023, Steve Todd explained that the trust scores take into account the source and quality of the data, the security of the data transmission, compliance, reputation of the providers, etc. Alvarium considers the presence of a trusted platform module (TPM), hash, authentication, TLS encryption, signature verification, and immutable storage.
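The exact scoring logic is meant to be tailored per deployment. As a rough sketch of the idea (not Alvarium’s actual algorithm; the factor names and weights below are invented for illustration), a confidence score can be derived from the trust annotations attached to a packet:

```python
# Hypothetical DCF-style scoring: each trust factor observed for a packet
# contributes a weight to the overall confidence score (0..1).
TRUST_WEIGHTS = {
    "tpm_present": 0.25,        # hardware root of trust on the gateway
    "payload_hashed": 0.15,     # a hash was generated for the payload
    "authenticated": 0.20,      # the device authenticated securely
    "tls_in_transit": 0.15,     # data traveled over TLS
    "signature_valid": 0.15,    # the payload signature verified
    "immutable_storage": 0.10,  # annotations were anchored to a ledger
}

def confidence_score(annotations):
    """Return a 0..1 score from the boolean trust annotations of a packet."""
    return round(sum(weight for factor, weight in TRUST_WEIGHTS.items() if annotations.get(factor)), 2)

print(confidence_score({
    "tpm_present": True, "payload_hashed": True, "authenticated": True,
    "tls_in_transit": True, "signature_valid": False, "immutable_storage": True,
}))  # -> 0.85
```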

The process of assigning a confidence score (image credit)

The importance of collaboration in creating an environment of trust is reflected in the name Alvarium, which is the Latin word for “beehive.” The framework can also combine multiple trust fabrics to generate confidence scores based on how the data was collected and processed across different stages and actors. According to Steve, the Alvarium framework is intended to provide an algorithm for calculating confidence scores rather than the one and only DCF; each organization is expected to set up a DCF with trust insertion components that fit its specific needs.

“No single entity can own the trust—after all, imagine if one company owned the Internet.” —Steve Todd, Dell

Dell notes that the TPM is essential for establishing a hardware root of trust, as it is responsible for signatures on device data, secure boot, and secure onboarding. The latter refers to performing an initial handshaking protocol between the TPM and the management software that oversees the health of gateway devices. A DCF can also use a different hardware root of trust, such as a trusted execution environment that runs apps in a secure manner.

To protect against tampering, the fabric can verify whether a hash was generated for a piece of data and determine which hashing algorithm was used. Furthermore, a DCF can identify and record threat activity and take it into consideration when assigning a confidence score.
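For example, a minimal tamper check can recompute the payload digest and compare it with the one recorded earlier in the data path (the annotation layout below is hypothetical, not Alvarium’s format):

```python
import hashlib

def verify_payload(payload, annotation):
    """Recompute the payload digest and compare it with the recorded one.

    The annotation dict is a made-up illustration: it records which hashing
    algorithm was used along with the resulting digest.
    """
    recomputed = hashlib.new(annotation["alg"], payload).hexdigest()
    return recomputed == annotation["digest"]

payload = b'{"sensor": "ch4", "ppm": 412}'
annotation = {"alg": "sha256", "digest": hashlib.sha256(payload).hexdigest()}

print(verify_payload(payload, annotation))         # True: untouched data passes
print(verify_payload(payload + b" ", annotation))  # False: any modification fails
```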

Whenever a piece of data passes through an edge node, it calls an API that detects and annotates the operation being performed. Alvarium can check if a device has secure authentication and can request logs describing access attempts. According to Dell and Intel’s white paper, the DCF collects this and other additional metadata and attaches it to the packet. Trevor Conn, Principal Architect behind Project Alvarium, explained the process in a virtual meeting on March 13, 2023.

“We came up with a concept called trust insertion points, that are essentially places along that path that the data is taking where we can perform these annotations. At the end of the data flow, there’s a measure of trust that’s calculated resulting from the annotations that are captured at each insertion point.” —Trevor Conn, Dell

Example of end-to-end trust insertion points (image credit)

Trust scores are stored using distributed ledger technology (DLT), such as IOTA’s Tangle or Ethereum. This way, the storage provides immutability, transparency, and auditability.

A pilot implementation for a wine producer

During a webinar in March 2023, the project’s team shared the details of a pilot implementation for VSPT Wine Group—one of the largest wine producers in Chile. The group operates the world’s first biogas plant that uses harvest waste as its only fuel to provide the Viña San Pedro winery with 60% of its energy needs. The company used a set of sensors to provide an accurate real-time view of the facility’s carbon footprint. However, creating certified emission reduction statements required a solution that could verify the accuracy of collected data.

Usually, verifying emission statements involves a person coming to the facility and taking manual measurements with a clipboard and a spreadsheet. The process is called measurement, reporting, and verification (MRV)—for this facility, it took 24–48 months. In addition to taking a lot of time, manual MRV processes leave significant room for human error. Mathew Yarger—a cofounder of the DigitalMRV platform and an advisor at IOTA—cited a survey estimating an average error rate of 30–40% in emissions measurements.

Furthermore, a 2019 study revealed that the US fertilizer industry produced 29,000 tons of methane emissions annually, which is over 100x more than the industry’s self-reported estimate of 200 tons.

“There is a huge number of errors that come from the auditing practices and the methodologies that are used to quantify the emissions and what’s happening. Because they are heavily analog and it’s not taking data in real time, there is always room for human error…Organizations are reporting the emissions that they think they’re contributing into the atmosphere, and they’re trying to account for those appropriately, but then it comes out that it’s a drastically different number.” —Mathew Yarger, DigitalMRV

Alvarium-backed data flow for the biogas plant in Molina (image credit)

To address these issues, VSPT Wine Group collaborated with Dell, ZEDEDA, IOTA, ClimateCHECK, and DigitalMRV to automate the MRV process and ensure data confidence and compliance. As a result, the new system provided transparency and auditability, while reducing the time required for MRV to just 4–6 weeks. This solution leverages Project Alvarium’s DCF and Project EVE’s ability to bring cloud computing to remote edge locations.

When presenting the results, Kathy Giori of ZEDEDA also emphasized the importance of having security “running all the way down to the bare metal.” A Dell server with EVE installed was shipped to Chile and connected to the network. This served as a control plane, allowing the team to update and manage applications without the need to go on-site. Project Alvarium provided a secure application plane by helping to give trust scores to all data inputs and generate a methane emission reduction statement.

“EVE locks down the bare metal, IOTA secures and routes the data, and Alvarium applies the confidence score so that the auditors know they can trust the source.” —Kathy Giori, ZEDEDA

A dashboard showing gathered data along with confidence scores (image credit)

Project Alvarium’s DCF built on the IOTA Tangle is used to gather data, run it through Alvarium’s algorithm, and then store the results on a ledger. During the webinar, Steve Todd pointed out that the system takes in data from sensors, as well as manual readings (taken by people). This introduces additional factors that Alvarium has to account for.

When it comes to sensor data, Alvarium is able to record the levels of security as the information travels through the entire system, resulting in an accurate trustworthiness score. As for the data taken by facility personnel, the system can only measure the security of the server where the information was entered. As a result, manual readings tend to have a lower confidence score.

“Adding data to an immutable ledger can increase confidence, because it brings tamper resistance, but it doesn’t quantify how the data is managed along the way. If you’re taking bad data and putting it into a ledger it’s a little more secure but it’s still bad data.” —Mathew Yarger, DigitalMRV

The types of data gathered by the Digital MRV system (image credit)

Future plans

According to the project’s wiki at LF Edge, in 2023, the team aims to focus on growing adoption, engaging partners, and making Alvarium more flexible and applicable to use cases beyond IoT.

Aligning with this strategy, on February 7, 2023, Dell joined the Hedera Governing Council. The organization stands behind Hedera, an open-source public ledger, which could help Alvarium with contract automation, ESG reporting, and other R&D activities, the announcement says.

Currently, the development of Alvarium is governed by a Technical Steering Committee (TSC), which was formed after the project was admitted to LF Edge in 2021. The TSC meets biweekly to discuss the roadmap, architecture, potential partnerships, etc.

During a virtual meeting on March 13, 2023, the TSC discussed the issue of reducing duplication in trust scores. Right now, device-related information (the presence of a TPM, secure boot, etc.) is annotated for every piece of data. However, because these hardware factors cannot change between two points of data capture, the information is redundant. As such, Trevor Conn proposed associating such data with a device-level confidence profile instead.
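A rough sketch of that intent, using hypothetical structures rather than the project’s actual code, keeps static, hardware-level facts in a per-device profile that each set of annotations merely references:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DeviceProfile:
    """Static facts that cannot change between two captures on the same device."""
    device_id: str
    tpm_present: bool
    secure_boot: bool

@dataclass
class ReadingAnnotations:
    """Per-reading facts, plus a reference to the device profile (by ID)."""
    device_id: str
    tls_in_transit: bool
    signature_valid: bool

# Registered once per device instead of being copied into every annotation set.
profiles = {"gw-01": DeviceProfile("gw-01", tpm_present=True, secure_boot=True)}

def merged_annotations(reading):
    """Combine per-reading annotations with the referenced device profile."""
    profile = profiles[reading.device_id]
    return {
        "tpm_present": profile.tpm_present,
        "secure_boot": profile.secure_boot,
        "tls_in_transit": reading.tls_in_transit,
        "signature_valid": reading.signature_valid,
    }

print(merged_annotations(ReadingAnnotations("gw-01", tls_in_transit=True, signature_valid=True)))
```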

A solution using IoT for emissions tracking (image credit)

Another proposed improvement involved operationalizing the confidence score of a software artifact resulting from a CI/CD pipeline. Here, Trevor Conn mentioned using a software bill of materials to ensure that a given build was created successfully and that there were no exploits in the pipeline. In the future, this could help orchestrators decide whether to deploy or use a particular piece of software. For example, companies can create a rule to block any workload that has a trust score lower than 90%.
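A hypothetical sketch of such a gating rule (the threshold and artifact fields are invented) might look like this:

```python
MIN_TRUST_SCORE = 0.90  # e.g., "block anything scoring below 90%"

def admit_workload(artifact):
    """Allow a build to run only if its supply-chain trust score clears the bar.

    `artifact` is a made-up record from a CI/CD pipeline that carries the
    confidence score computed from its SBOM and build annotations.
    """
    return artifact.get("trust_score", 0.0) >= MIN_TRUST_SCORE

print(admit_workload({"image": "registry/app:1.4.2", "trust_score": 0.93}))  # True
print(admit_workload({"image": "registry/app:1.4.3", "trust_score": 0.71}))  # False
```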

During the meeting on March 27, 2023, the TSC discussed Secure Relationship Protocol Network Operating System (SRPNetOS). The project uses bills of materials, which contain a list of all the hardware and software components used to build a particular product, application, or network. A DCF could use this information to check if any of the components have unpatched vulnerabilities and assign a more accurate trust score.

Example architecture of a policy-driven network (image credit)

Those who want to learn more about Project Alvarium can find its code in this GitHub repository. Dell’s technical paper explains the vision behind Alvarium, and there is also an overview of the project in the wiki. If you want to follow the project more closely, check out its mailing lists for TSC-related and other items, as well as the official Twitter and LinkedIn pages. There is also an #alvarium channel in the LF Edge Slack workspace.

Want details? Watch the videos!

In this March 2023 webinar, Kathy Giori, Steve Todd, and Mathew Yarger presented the pilot implementation of Project Alvarium for a biodigestion plant in Chile.

During another webinar (August 2022), Kathy Giori went into more technical details on how the biodigestion plant works, as well as the open-source IoT technologies at play.

Frequently asked questions (FAQ)

  • Why is there an issue of trust with IoT data?

    Data that moves from an IoT sensor to the end app passes through other devices and systems, and each one of those could be used to tamper with the information. Furthermore, an attacker can use advanced AI tools to generate extremely realistic fake data that can easily fool a human.

  • What’s wrong with a zero-trust approach for IoT?

    A zero-trust approach means treating all data as untrustworthy until proven otherwise. This is extremely difficult to scale for an IoT system with thousands of devices sending packets every few seconds.

  • What is a data confidence fabric (DCF)?

    A DCF is a loosely coupled collection of trust insertion technologies that measure how trustworthy data is as information moves from devices to apps. The system assigns confidence scores based on the presence of a TPM (trusted platform module), secure boot, TLS encryption, signature verification, hash, etc.

  • Which companies work on Project Alvarium?

    Project Alvarium was developed internally at Dell and donated to the Linux Foundation. Now, in addition to Dell, the project is supported by a number of industry leaders, including IOTA Foundation, Intel, Arm, IBM, VMware, OSIsoft, Unisys, etc. The team behind Alvarium also collaborated with ZEDEDA, ClimateCHECK, and DigitalMRV when creating the first pilot implementation for VSPT Wine Group.


About the experts

Steve Todd is Fellow and VP of Data Innovation and Strategy in the Dell Technologies Office of the CTO. He is a long-time inventor in HiTech, having filed over 400 patent applications with the USPTO. Steve’s innovations and inventions have generated tens of billions of dollars in global customer purchases. His current focus is the trustworthy processing of edge data. As a cofounder of Project Alvarium, Steve is driving the exploration of distributed ledger technology with partners such as IOTA and Hedera. He earned Bachelor’s and Master’s Degrees in Computer Science from the University of New Hampshire.

Trevor Conn is Distinguished Engineer at Dell Technologies. He is currently working as Research Lead for Digital Twins in the Dell Office of the CTO and assists in the design and delivery of distributed edge-to-cloud solutions. Trevor has extensive experience with .NET (C#), ASP.NET / MVC / WebAPI, Go, MQTT, RabbitMQ, Apache Kafka, SQL Server, MongoDB, NServiceBus, and ElasticSearch. Being a cofounder of Project Alvarium, he currently works as its principal architect and engineer.

Mathew Yarger is Cofounder of DigitalMRV and Advisor at IOTA. He has a diverse computer science and network security background, as well as experience with blockchain, DLT, and Distributed Systems Theory. Mathew has expertise in enabling flexible and privacy-centric data transfer for smart cities, critical infrastructure, environmental and energy systems, as well as autonomous vehicles. At the IOTA Foundation, he worked in leadership and engineering positions connected with sustainability, smart cities, smart mobility, and security. He was Director of Cybersecurity at Geometric Energy Corporation, and Digital Forensic Examiner at the DoD Cyber Crime Center Cyber Forensics Lab while working for General Dynamics Mission Systems. Before that, Mathew served as Cybersecurity Analyst in the US Army.

Kathy Giori is Director of Product Engineering at ZEDEDA and CEO at Tricyrcle. Kathy has a background in open-source collaboration, embedded Linux platform strategy, and securely managed IoT product definition, integration, and testing. At Mozilla, she was Senior Product Manager and Staff Developer Evangelist, and she also served as Vice President of Operations at Arduino. Kathy holds an MSEE from Stanford University and a BSEE from the University of Minnesota.


This article is written by Yaroslav Goortovoi and Alex Khizhniak.


Alternatives to Google Cloud IoT Core—Where to Migrate? (April 26, 2023)



Google will sunset its IoT Core on August 16, 2023. Some of the edge devices may become unavailable, experts say.

What happened and why the urgency

Last year, Google announced that it would retire the Cloud IoT Core service, giving customers one year to migrate. It was also noted that the documentation for the product would no longer be available after August 15, 2023.

Until now, Cloud IoT Core has been used by customers for device authentication, communication with the edge, configuration updates, data ingestion, etc. The main components of the product are a device manager and two protocol bridges (MQTT and HTTP) for connecting to Google Cloud Platform (GCP). Telemetry data is forwarded to a Cloud Pub/Sub topic and can then be used for analysis with other GCP services.

A typical workload using IoT Core (image credit)

Since the announcement, companies have put effort into migrating off the platform. Most likely, the majority of them have either finalized the migration or are at least in the process. Last month, a Reddit user posted a message mentioning 10,000 devices under high load that are still being migrated due to the complexity involved.

So, if you had reasons to avoid moving the IoT workloads until now or your migration failed and you are looking for a better technology option, read on.

Note: Google is not alone in deprecating IoT services. Earlier, IBM decided to discontinue the Watson IoT platform, while SAP has recently shut down its suite previously known as Leonardo.

 

 

What’s the plan?

After the anxiety that Google’s announcement brought last year, a lot of articles appeared for CTOs, architects, and IoT engineers about possible next steps. While many overviews suggested migrating the system (sometimes completely) to Azure, AWS, or another cloud platform, in reality many other options do exist.

Here’s why. Within particular deployments, the main functions of Cloud IoT Core are device management, data ingestion, and messaging. Depending on which of these you are using, Google recommends different migration options in its documentation.

In brief (very simplified):

  • If you rely on IoT Core for device management and data ingestion, you may now need additional tools for these scenarios (or a single platform combining both requirements).
  • For device communication, you can go with an MQTT broker (see the sketch below).

The document notes that the hardest job will be to migrate the processing of devices that both produce and consume data for / from back-end workloads.
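For illustration, here is a minimal sketch of a device publishing telemetry to a self-managed broker, assuming the paho-mqtt client (1.x API) and placeholder values for the broker address, credentials, and topic:

```python
import json
import ssl

import paho.mqtt.client as mqtt

BROKER_HOST = "mqtt.example.com"        # placeholder: your EMQX/HiveMQ/Mosquitto endpoint
TOPIC = "devices/sensor-42/telemetry"   # placeholder topic layout

client = mqtt.Client(client_id="sensor-42")             # paho-mqtt 1.x constructor
client.username_pw_set("sensor-42", "device-password")  # or configure mTLS client certificates
client.tls_set(cert_reqs=ssl.CERT_REQUIRED)             # keep TLS, as with IoT Core's bridge
client.connect(BROKER_HOST, 8883)
client.loop_start()

payload = json.dumps({"temperature_c": 21.4, "ts": 1690000000})
info = client.publish(TOPIC, payload, qos=1)  # QoS 1 for at-least-once delivery
info.wait_for_publish()                       # block until the broker acknowledges it

client.loop_stop()
client.disconnect()
```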


A standalone MQTT broker on GCP (image credit)

In addition, updates may also be needed on the edge devices themselves. If not done over the air before August 2023, this may require updating devices in the field manually. If not updated, you may “consider them lost,” experts from Leverege write in their IoT Core migration guide.

Major alternative options and scenarios

So, when it comes to exact products and implementations, there are at least four possible groups of options for replacing IoT Core.

  1. First of all, in the official announcements, Google noted that it sees the partner ecosystem as the main driver for further development in this area. Therefore, Google’s suggested option is to choose among its partners. The product’s page features a list of preferred tools and platforms—including Aeris, EMQX, FogLAMP, KORE OmniCore, Leverege, Litmus, ThingsBoard, etc.

    A combination of these solutions is also possible (e.g., this tutorial explains how to integrate EMQX with ThingsBoard).

    Communication via an EMQX cluster (image credit)

    On top of that, new products from Google partners have appeared, such as ClearBlade IoT Core. (Here’s a technical overview of this automated offering and a migration tutorial.) SOTEC, for its part, delivered a dedicated solution based on its contributions to the open-source platform Eclipse Hono. SoftServe created IoT EnCore (utilizing VerneMQ), so options do exist.

    How ClearBlade IoT Core works (image credit)
  2. You can also turn to platform providers outside the “featured” list, of course. Vendors from different verticals are competing for the privilege of acquiring IoT Core users. Many of them are Google partners, too. Some publish explanations of why or how to migrate; others offer incentives.

    For instance, users migrating from Google IoT Core to the akenza platform can get a 10% discount for one year. Qubitro offers six months of free credits for migrating to its platform.

    Other products that can be considered as alternatives to IoT Core—according to their teams—include Losant, Davra, Kaa, CLEA (SECO), Cloud Studio, LTIMindtree IoT Core+, Mapify IoT, Rayven Dynamix, etc. Large vendors such as Siemens and Software AG are also ready to host new customers on their Insights Hub and Cumulocity IoT. The choice depends on how much of their functionality you actually need, as well as the cost and other criteria essential for your project.

    (As a Google partner in cloud consulting, Altoros can also help.)

  3. Another option, yes, is using similar IoT services from other major cloud platforms, such as Azure IoT Hub or AWS IoT Core. However, you may have your own reasons to avoid a multicloud approach—whether it’s cost, compatibility, complexity, etc. Still, if you already have a multicloud deployment or have the proper expertise and resources, this really is an alternative.

    Device management with Azure IoT (image credit)

    (Note: The funny thing is that a number of articles even suggested migrating to IBM Watson IoT, which will itself be discontinued on December 1, 2023. Previously, the product competed with the IoT services from Azure and AWS. However, this is not an option anymore, of course.)

  4. Finally, you can rely on messaging adapters / MQTT brokers. For instance, HiveMQ released the Google Pub/Sub Extension for this particular purpose. The company evaluated the performance of the extension at 50,000 MQTT messages per second (which means up to 4.32 billion messages a day).

    Soracom offers a similar approach, ensuring direct M2M communication between devices and a cloud platform through the Funnel adapter (see the docs). Open-source options for MQTT include Mochi, Mosquitto, etc. (See also the reasoning for the Mosquitto Pro version.) A broker from Google partner EMQ was mentioned earlier. A minimal sketch of this broker-to-Pub/Sub bridging pattern follows after this list.

    HiveMQ Enterprise Extension for Google Cloud Pub/Sub (image credit)

    However, in this scenario, you may still need a separate device management solution in case your requirements include firmware updates, provisioning, monitoring, etc.
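Conceptually, the bridging pattern behind these adapters boils down to subscribing to device topics on the broker and republishing every message to a Cloud Pub/Sub topic. The sketch below is a minimal, hand-rolled illustration of that idea, using paho-mqtt (1.x API) and the google-cloud-pubsub client with placeholder project, topic, and host names; it is not a substitute for the production-grade extensions mentioned above:

```python
import paho.mqtt.client as mqtt
from google.cloud import pubsub_v1

PROJECT_ID = "my-gcp-project"      # placeholder GCP project
PUBSUB_TOPIC = "device-telemetry"  # placeholder Pub/Sub topic
MQTT_HOST = "mqtt.example.com"     # placeholder broker endpoint

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, PUBSUB_TOPIC)

def on_connect(client, userdata, flags, rc):
    # (Re)subscribe once the broker accepts the connection; '+' matches any device ID.
    client.subscribe("devices/+/telemetry", qos=1)

def on_message(client, userdata, msg):
    # Republish each MQTT message to Pub/Sub, keeping the MQTT topic as an attribute
    # so downstream consumers can tell which device/stream it came from.
    publisher.publish(topic_path, data=msg.payload, mqtt_topic=msg.topic)

bridge = mqtt.Client(client_id="mqtt-to-pubsub-bridge")  # paho-mqtt 1.x constructor
bridge.on_connect = on_connect
bridge.on_message = on_message
bridge.connect(MQTT_HOST, 1883)
bridge.loop_forever()
```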

Hurry up, but do it wisely

There’s certain time pressure for this migration initiative; however, it makes sense to get the most out of this unexpected situation.

First, there is a chance that the requirements for the existing deployment have evolved since your MVP/pilot started using Google Cloud IoT Core. Second, when choosing among IoT platforms or messaging brokers, you have both proprietary and open-source options. This variety of tools can help you adjust your architecture to your current needs and budgets, especially if they were affected by the ongoing crisis.

The choice of the technology depends on your implementation, but it is also a chance to review the environment architecture and optimize accordingly.

If you’re still coping with the migration (due to the size, complexity, or failures) or expect that some of the edge devices will be “bricked” after IoT Core’s EOL, I’d be glad to learn about your experience here, here, or in the comments below. I’m ready to add your story to this article.



46% of Medical IoT Devices Have a Vulnerability, a New Study Reveals (April 17, 2023)



For 53% of institutions, breaches in connected systems affect patients—e.g., lead to delayed surgeries or transfers to other facilities.

Types of IoT devices

In the pursuit of personalized care, medical organizations are adopting IoT devices to collect health-critical information in real time and improve the quality of treatment. A February 2023 report from Cynerio revealed that the most popular types of connected devices in hospitals are IV pumps, patient monitors, and glucometers.

More institutions are integrating the readings from remote patient monitoring equipment with centralized medical systems. According to a November 2022 study by CHiME, almost 90% of organizations send EKG information directly to an electronic health records (EHR) system. Additionally, 80% upload blood pressure information, and 76% integrate medication dispensing information. The research also shows that 51% of healthcare organizations integrate wearables data with EHRs.

Most common types of IoT devices in a hospital (image credit)

At the same time, medical IoT devices run a variety of operating systems. CHiME indicates that Linux is by far the most popular, with a 46% share. However, more than half of the devices use heterogeneous proprietary platforms, as well as outdated ones, such as Windows CE. In addition, 82% of respondents in a 2022 Capterra survey reported medical devices running a Microsoft operating system older than Windows 10.

The variety of operating systems used in medical IoT devices (image credit)

As the use of robotics, sensors, and digital technologies continues to grow, new opportunities for exploitation emerge. A February 2023 report from Health-ISAC found that hospitals with a higher number of connected medical devices faced more cyberattacks and had increased chances of multiple incidents.

This article explores the most common types of vulnerabilities, sheds some light on what to expect in 2023, and provides recommendations from IoT security experts.

Types of vulnerabilities to look out for

In April 2022, five software vulnerabilities were found in the firmware of TUG robots used by a range of US hospitals. These could enable criminals to control the machines, as well as lock doors and elevators, creating threats to patients and staff. Attackers could also gain access to medical documents and surveillance footage, and spread malware to other devices by taking over administrative sessions in the robots’ online portal. Although the issue was patched quickly, other vulnerabilities may still be undiscovered.

Unfortunately, some IT security issues lead to more severe consequences. In February 2023, a cyberattack on the Tallahassee Memorial Hospital in Florida forced the organization to shut down its entire computer system. As a result, the institution had to send emergency patients to other facilities and cancel some nonemergency surgeries.

The integration of IoT devices has increased the exposure of healthcare to cybersecurity risks. A February 2023 study by Cynerio showed that 46% of medical devices had an unaddressed vulnerability. Furthermore, 11.7% of devices had at least one critical risk. Consequently, healthcare organizations with a higher percentage of connected medical devices are more vulnerable to cyberattacks.

One reason for these vulnerabilities is that healthcare organizations often fail to implement basic security measures. In Capterra’s report, only 43% of respondents said they changed default passwords on IoT medical devices, and just 32% applied the latest updates.

Common medical IoT device vulnerabilities (image credit)

A January 2023 research paper by Nozomi Networks warns that the large number of vulnerable IoT devices that could be compromised remotely poses an additional challenge. One of the main ways attackers get access to IoT devices is the use of default credentials. Since many organizations neglect to set new passwords, threat actors can simply log in without needing to use exploits or social engineering.

The most common login–password combinations include:

  • nproc:nproc
  • admin:admin
  • admin:1234
  • root:root
  • admin:
  • root:
  • admin:password
  • admin:123456
  • admin:12345
  • user:user

Let’s explore how these gaps in security could be exploited by hackers.

Attacks to expect

A January 2023 report by Nozomi Networks and a study by Health-ISAC predict six types of attacks that healthcare organizations are likely to face this year:

  • distributing ransomware
  • DDoS attacks
  • intercepting sensitive information during transmission
  • abusing exploits in medical devices
  • finding devices with weak/default login credentials
  • creating synthetic accounts to circumvent security systems

The Nozomi Networks report also notes that AI models, such as ChatGPT, could pose security risks, because such systems allow even unskilled attackers to create malicious code or generate realistic-looking phishing e-mails.

Ransomware attacks are on the rise. A January 2023 survey by the Ponemon Institute found that 47% of medical organizations experienced a ransomware attack last year. Of those attacked, 46% stated that the breach was caused by a third party. The average ransom—paid by 67% of affected organizations—was a staggering $352,541.

For 53% of respondents, a ransomware attack affected patient care, and 45% of organizations experienced complications from medical procedures due to ransomware attacks in 2022. The average disruption lasted 35 days.

In addition, 70% of respondents had to transfer patients to another facility because of an attack, and 68% said this increased their length of stay. Furthermore, delays in treatment resulted in poor medical outcomes for 58% of organizations, and 21% said that ransomware affected mortality rates. A report by Capterra also confirms these findings, discovering that 48% of healthcare cyberattacks impact patient care, and 67% affect personal data.

A February 2023 report by Health-ISAC (Information Sharing and Analysis Center) outlines a few potential attack vectors, as well. For instance, cybercriminals can use credential lists, which are freely available on many forums, to gain access to health records. Hackers also take advantage of compromised personal information to create synthetic accounts for bypassing identity checks, paying medical bills, stealing health data, etc.

The effect of ransomware attacks on patient care (image credit)

Security recommendations

As cyberthreats continue to grow in frequency and sophistication, healthcare organizations must prioritize proactive defense strategies to protect their networks and data. Security is considered an essential or high-priority target for digital transformation in 2023, according to a CHiME report.

Priority in healthcare digital transformation (image credit)

Nozomi Networks recommends prioritizing network segmentation, asset discovery, vulnerability management, patching, logging, endpoint detection, and threat intelligence.

Health-ISAC suggests performing risk assessments and developing threat models to identify how devices may be susceptible to cyberattacks. The company also advises hospitals to change default or weak login credentials, and apply patches as soon as they become available. Engaging with caregivers to better understand the potential impact a breach can have on patient care is also essential.

Additionally, the Ponemon Institute provides some advice for protecting against ransomware attacks. With 46% of victims stating that the intrusion was caused by a third party, it is crucial to have policies for assessing the security of external systems (e.g., IoT devices), eliminating gaps, and recovering.

Finally, organizations must also be aware of threats to customer-facing products, which are routinely targeted by attacks designed to extract data. Capterra advocates for thorough monitoring of medical device data and traffic, along with aligned controls at the network, application, authentication, and risk layers. These measures should reduce the risk of credential stuffing, account takeovers, carding attacks, and synthetic account creation.

(Read how a medical software provider secured its spirometry/oximetry system with authentication token hashing, encryption, biometric sign-in, and controlling Bluetooth connectivity. Or, check out a story of a startup that produces blood sampling kits, which created a HIPAA-compliant IoT prototype involving barcode-scanning and temperature monitoring.)

To summarize, experts recommend that medical organizations take the following steps:

  • always apply the latest updates and patches
  • change default login credentials and strengthen passwords
  • create policies for auditing third-party security
  • establish processes for recovering after an attack with minimal disruption
  • perform regular assessments to identify and patch vulnerable devices
  • ensure traffic monitoring to reduce the risks from synthetic accounts

In conclusion, proactive defense strategies and threat modeling are indispensable for critical infrastructure and healthcare institutions. Since an attack on medical devices endangers people’s health, investing in robust security measures can eventually save human lives.

Frequently asked questions (FAQ)

  • What are the most common types of IoT devices used in hospitals?

Cynerio’s February 2023 report shows that the most popular types of medical IoT devices are IV pumps (used in 38% of hospitals), patient monitors (19%), and glucometers (7%). Healthcare organizations can also have medicine dispensers, ultrasound machines, as well as IoT gateways, robots, location-tracking systems, etc.

  • What are the potential consequences of a ransomware attack on medical IoT devices?

A January 2023 survey by the Ponemon Institute found that the average victim paid a $352,541 ransom and experienced a 35-day disruption. As a result, 70% of respondents had to transfer patients to another facility, and 68% said that it increased their length of stay. Furthermore, 58% of affected hospitals admitted that treatment delays worsened medical outcomes, and 21% reported an increase in mortality rates.

  • What are the most common ways attackers gain access to medical IoT devices?

A February 2023 paper by Cynerio indicates that 46% of medical devices have an unpatched vulnerability, and 11.7% have a critical one. According to a study by Capterra, only 43% of respondents change default login credentials on their devices, and just 32% apply the latest patches. Health-ISAC also warns hospitals that attackers could use stolen personal information to create synthetic accounts and deceive security systems.

  • How to protect medical IoT devices from cyberattacks?

Nozomi Networks suggests prioritizing network segmentation, vulnerability management, patching, logging, threat scanning, etc. Health-ISAC advises hospitals to perform risk assessments and identify how devices may be susceptible to cyberattacks. The Ponemon Institute urges hospitals to create policies for assessing security of third-party systems. Capterra recommends increased data and traffic monitoring, along with aligned controls at the network, application, authentication, and risk layers.



This article is written by Yaroslav Goortovoi and Alex Khizhniak.


Can Digital Tools Cut Long Wait Times as Banks Fail to Catch Up? (March 30, 2023)


(Feature image credit: Digital Banking Report)

As 2022–2023 reports show that consumers wait for too long and expect much more from their providers, banks need to step up.

The growing frustration with the banking sector

With the pace of the world today, it is no surprise that 76% of people expect to be engaged immediately when contacting their bank. However, in 2022, 39% of American respondents who visited a bank in person reported waiting longer than they anticipated. Worse still, after the queue, customers spend even more time filling out application forms and going through validation/verification, especially when opening an account.

Even though banks are taking steps to implement transformation initiatives, long wait times also affect digital channels. According to a 2022 study by Verint, 49% of potential customers start opening a banking account on the bank’s website, yet only 34% ultimately succeed in finalizing the process. Most of those who fail to complete the registration end up seeking in-person assistance. Furthermore, 49% of respondents stated that resolving an issue with online access to their account took longer than expected.

This is confirmed by a 2022 report from Signicat: 68% of European consumers have abandoned the online account creation process midway through. Out of those people, 21% have cited the length of the registration process as the main reason, with another 21% saying that they were asked for too much personal information. Overall, 92% of respondents were concerned about the amount of data they share with a bank.

If manual customer verification procedures are involved, it can take weeks or even months before a person can actually use their account, a global 2022 report by Fenergo finds. While the majority of requests are resolved within 1–2 months, in 22% of cases, Know-Your-Customer (KYC) procedures may take 121–150 days.

Manual onboarding can take months (source: Fenergo)

With the long wait times and convoluted processes, it is no wonder that bank customer satisfaction falls by 3% year-on-year, according to a 2022 study. 30% of Gen Z and Millennials admit that they would switch banks if it required no effort on their part.

To stay competitive, traditional banks embrace digital transformation as a strategic priority, introducing online and mobile tools to enhance customer experience. The COVID-19 pandemic marked a turning point for online banking. According to S&P Global, US banks closed 2,927 branches in 2021 (over 2x more than in 2019). The use of digital tools for accessing personal finances has become a norm in Europe, too. For instance, a 2022 report from Ireland’s Department of Finance states that 63% of domestic customers are using online banking at least weekly.

The pandemic caused people to favor mobile banking (source: S&P Global)

So, what do consumers value for digital banking today and what does it mean for financial institutions? Read on to learn.

Customers expect security and personal insights

While customers are forced to wait in queues and go through lengthy validation, their expectations stretch beyond faster service. Today, people want banks to be proactive in providing financial guidance, reminding about regular payments, and ensuring security, surveys say.

Specifically, an August 2022 report shows that 53% of users want to receive notifications about upcoming payments, such as tuition, mortgage, auto loans, etc. 47% of consumers want banks to analyze personal income, expenses, and savings information to help make better financial decisions. Notably, 46% of respondents want institutions to act as a trusted advisor for major spendings, and 39% would like their bank to help them stay on budget by preventing odd purchases.

Consumers expect more proactivity and personalization (source: Digital Banking Report)

As for security, the survey from Verint observed that fraud prevention was one of the most highly valued factors when picking an institution in 2022. Customers expect banks to monitor transactions for suspicious activity (e.g., someone trying to use your card in another country) and take preventive action.

56% of consumers consider Social Security number breach notifications extremely valuable, as indicated by a December 2022 report from Insider Intelligence. Unusual account activity notifications are appreciated by 51% of respondents. At the same time, 46% of people aged 41 and above are not aware that security notifications exist, while 22% don’t know how to turn on this functionality, Verint notes.

In general, the importance of digital channels for consumers is growing, while the location of physical branches becomes less relevant. Furthermore, compared to 2021, American users started to favor mobile apps over banking websites in 2022, the study from Verint says. In Ireland, for instance, the preference is even more pronounced: 69% prefer mobile apps to a bank’s website (22%), according to a government survey detailed in 2022.

The growing importance of fraud protection and mobile apps (source: Verint)

The missing pieces

According to a 2021 report by S&P Global, customers use mobile banking apps for the following most common scenarios:

  • checking account balances
  • transferring money between accounts
  • depositing a physical check using the phone’s camera
  • reviewing transaction history and accessing detailed statements
  • paying bills and automating recurring transactions (for utilities, Internet, phone service, etc.)

People also often use banking apps to locate ATMs on a map, find a way to contact customer service, or schedule branch appointments in advance.

The most popular scenarios for using mobile apps (source: S&P Global)

Despite the importance of mobile experience, a lot of highly requested functionality is still not implemented in many banking apps. For example, S&P Global’s report notes that customers from the Mid-Atlantic region of the US name cardless ATM access as the feature they miss most.

Other welcome additions may include seeing the balance without logging in, (un-)blocking cards, as well as voice commands. The latter may range from asking Google, Siri, or Alexa for your balance to using voice for inputting transaction details or confirming payments within the app. Customers also want the ability to set spending limits, report a card lost/stolen, order a new one, view credit score details, dispute unauthorized/fraudulent transactions, chat, etc. Some users would also like to have a smartwatch version of the app.

Users expect much more than banks actually implement (source: S&P Global)

As we can see, users expect a range of functionality that is not yet implemented by many banks. Therefore, to stay competitive, traditional players must adapt to customer needs at a much faster rate.

What does this mean for banks?

To serve customers faster, banks need to review their processes and identify bottlenecks that could be addressed with innovation and digitalization. For example, as lengthy manual Know Your Customer (KYC) validation slows down account opening, banks need to find ways to automate the process.

In particular, AI-powered systems can be used to recognize data from customer documents, while blockchain and digital identity solutions can accelerate audits. Robotic process automation (RPA) could enable bank employees to facilitate document-related workflows. E.g., this story describes how a provider of investment management services automated manual extraction of necessary content from financial reports.

While KYC verification mostly affects onboarding, queues and inefficient processes concern existing customers, too. Functionality for scheduling appointments and filling out any necessary forms online in advance could greatly reduce the time spent in a branch.

The growing demand for personalization will lead to integrating data analytics tools that help deliver financial insights based on a user’s income, spending, savings, etc. This is already confirmed by a 2022 survey by The Financial Brand indicating that banks prioritize security, mobile channels, data analytics, and AI for the next 3–5 years.

Banks’ priorities in digital transformation (source: The Financial Brand)

Still, mobile apps packed with customer-facing features increase the complexity of banking systems and create new potential points of vulnerability. In a March 2023 report, the World Bank listed security as one of the main digital banking challenges, as cyberattacks can compromise business continuity, leading to reputational and economic damages. As such, financial institutions need to invest in operational resilience frameworks, the report suggests.

As more customers want to see innovative features in their apps, banks might run into technical challenges when integrating new software with existing, legacy systems. While mobile banking apps tend to be built with modern frameworks and programming languages, the same cannot be said about the core banking software. Based on technologies created decades ago (e.g., COBOL), many of these systems are difficult and expensive to maintain or integrate with newer applications.

That’s why 69% of banks do not intend to replace legacy core systems as part of their digital transformation, according to the Digital Banking Report issued in June 2022. McKinsey estimates that switching to a new core system could cost large banks around $300–400 million, while medium-sized institutions may still need around $50 million.

For those banks that decide to update core software, a microservices architecture can make future maintenance easier. By breaking down the monolithic core software into smaller components, banks will be able to add new features without having to replace the entire system. McKinsey recommends that banks conduct the transformation in small, incremental iterations, rather than trying to replace the entire system at once.

The majority of banks do not plan to replace core systems (source: Digital Banking Report)

Finance is one of the most heavily regulated industries, so maintaining compliance can also be challenging when developing mobile apps. Banks need to follow region-specific laws, such as the Electronic Fund Transfer Act (EFT) or the Electronic Signatures in Global and National Commerce Act (ESIGN) in the US. In the EU, digital banking software needs to adhere to Strong Customer Authentication (SCA) requirements.

Consumer needs transform workforce/operations

Banks are accelerating digital transformation, as customer preferences shift to a personalized and seamless experience across mobile channels. The competitive landscape is also shifting rapidly, as traditional players face pressure from new entrants, such as FinTech companies and neobanks.

McKinsey recommends that banks work like technology companies, adopting a “digital-first” mindset and streamlining processes to meet customer expectations. This will have significant implications for the banking workforce—e.g., increasing the need for technology-related skills. Organizations will likely need to train employees in data analytics and other intelligent tools to provide insights and act as customers’ financial advisors.

Technology is expected to change the banking workforce (source: McKinsey)

At the same time, automation will inevitably have an impact on many routine tasks, such as invoicing, accounting, identity verification, and customer service. As a result, McKinsey estimates that a typical branch staff composition will shift away from tellers (comprising 36% now) to universal bankers (50% by 2030).

To introduce enhanced digital services (like this one), banks will require more developers to build and maintain new products in an agile manner, the report projects. As banks adopt cloud platforms for managing infrastructure and IT operations, McKinsey predicts changes in the composition of the technology workforce, too. By 2030, the share of IT operations and infrastructure engineers is expected to decline from 47% to 20%, while the share of developers grows from 30% to 45%.

Mobile apps matter when choosing a bank (source: S&P Global)

In this digital transformation, customer expectations around analytics, innovative services, and the availability of mobile tools will play a key role. The importance of the mobile experience for users is confirmed by an S&P Global survey across all regions of the US: 13.4%–24.9% of respondents stated that a better mobile app would make them consider switching banks. So, faster adaptation to digital consumer demands, fraud protection, and features driving personalization may help banks stay competitive.



This article is written by Yaroslav Goortovoi and Alex Khizhniak.


What Can We Learn from Stanford Hospital’s Use of IoT and Data? (March 23, 2023)



By investing $2B in innovations, the medical institution is creating a smart hospital—while still facing technological challenges on this journey.

The need to improve healthcare facilities

Stanford Hospital is a century-old, world-renowned medical institution located in California. The hospital is part of the Stanford University Medical Center and has a long history of providing patient care and conducting research. Over the past decade, the organization undertook a $2.1 billion project to update its facilities and enhance patient care.

The decision to modernize the hospital was driven by several factors, according to George Tingwald, Director of Medical Planning at Stanford Health Care. “The beginnings of the new hospital are rooted in the requirements from the state of California to upgrade buildings seismically,” noted George in a podcast. In addition, the hospital needed to expand its facilities to accommodate the growing population of patients in the region.

However, the hospital also recognized that advances in technology presented an opportunity to make healthcare more efficient. By modernizing its facilities and introducing new technologies, the hospital aimed to provide patients with a more personalized and holistic care experience.

Opened right before the COVID-19 pandemic, the new facility and its innovations helped the hospital to be better prepared for disruptions—e.g., by increasing the use of virtual care. At the same time, becoming a technology-driven organization brings new challenges, as well.

Integrated digital systems

To accelerate care, Stanford Hospital has been implementing IT advancements for years, combining clinical practices, research, and education.

Today, an electronic health record (EHR) system is a key component of the hospital’s digital ecosystem. Supplied by Epic, Stanford’s EHR platform enables doctors to access patient medical history in real time, enabling better-informed decisions. Additionally, patient records are updated automatically after each medical activity, removing the burden of manually entering information.

The EHR system was also designed to facilitate communication between healthcare providers, reducing the likelihood of errors, and ensuring that patients receive timely and appropriate care.

“Our care team members check in and check out for their shifts in our electronic health record system. So, we actually know who’s taking care of patients at any given moment in time. What we’re doing is exposing this to the patient, demystifying the experience for patients who may wonder who they’re talking to or who is taking care of them.”

Gary Fritz, Stanford Health Care

Christian Lindmark, VP/CTO at Stanford Health Care, notes that the EHR system is integrated with the infusion pumps in patients’ rooms. The system is programmed to send a medication order to the infusion pump. In response, the infusion pump sends information, such as the start and stop time, back to the EHR to document the process. This effectively reduces the risk of medication errors, improves the administration of medication, and decreases inconsistencies in nursing documentation.

Most of the EHR information is stored in the vendor’s core NoSQL database called Chronicles. From there, the records are sent to an Oracle-based relational database (Clarity) used for advanced reporting. The hospital then transforms the Oracle data into an enterprise data warehouse (Caboodle) with various subject marts. The purpose of this data flow is to give clinicians access to patient information while allowing researchers to use EHR data in their studies.
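To make this flow more concrete, below is a minimal, generic ETL sketch in Python. It uses in-memory SQLite stand-ins instead of the actual Chronicles, Clarity, and Caboodle systems, and all table and column names are hypothetical—the point is only to illustrate the extract–load–aggregate pattern, not Epic’s actual tooling.

import sqlite3

# Hypothetical illustration: a nightly job that copies operational records
# into a reporting store and then aggregates them into a subject mart.
ops = sqlite3.connect(":memory:")          # stands in for the operational store
ops.execute("CREATE TABLE encounters (patient_id TEXT, dept TEXT, cost REAL)")
ops.executemany("INSERT INTO encounters VALUES (?, ?, ?)",
                [("p1", "cardiology", 1200.0), ("p2", "cardiology", 800.0)])

dw = sqlite3.connect(":memory:")           # stands in for the reporting warehouse
dw.execute("CREATE TABLE encounters (patient_id TEXT, dept TEXT, cost REAL)")
dw.execute("CREATE TABLE dept_mart (dept TEXT, visits INTEGER, total_cost REAL)")

# Extract from the operational store and load into the warehouse.
rows = ops.execute("SELECT patient_id, dept, cost FROM encounters").fetchall()
dw.executemany("INSERT INTO encounters VALUES (?, ?, ?)", rows)

# Transform: build a department-level subject mart for reporting.
dw.execute("""INSERT INTO dept_mart
              SELECT dept, COUNT(*), SUM(cost) FROM encounters GROUP BY dept""")
print(dw.execute("SELECT * FROM dept_mart").fetchall())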

To help patients access their healthcare information and engage with medical staff, Stanford Health Care delivered the MyHealth app. Released in February 2015, the mobile app had over 600,000 users as of 2019. The tool enables patients to access EHR data, track health status, message their primary physician, schedule appointments, etc.

The MyHealth app in action (image credit)

Patients can fill out all their information on the MyHealth app in advance at home, allowing them to skip the line at the front desk. The app also has the eArrival feature, which helps patients to check in automatically and lets staff know as soon as the patient enters the clinic. While inside Stanford Health Care, patients can then navigate the hospital with the GPS functionality of MyHealth (more on the app).

(Read how a Norway-based healthcare institution automated appointment scheduling by developing a similar app for doctors and patients.)

After visiting the hospital, patients get in-app notifications when their test results are ready, providing access to laboratory data. Additionally, the tool provides patients with a list of medications prescribed by a physician.

MyHealth enables patients to view their allergy and immunization records, which can be updated through the app. Additionally, the app can send alerts to patients to remind them of upcoming immunizations that they may need based on their age and medical history.

When a patient is hospitalized, their loved ones can check on the patient’s status through the app. Additionally, they can be notified of transportation needs when the patient is about to be discharged.

The Internet of Things and smart rooms

The hospital uses sensors and devices across its facilities to collect data and provide insights for improving patient outcomes. The telemetry systems are capable of capturing data from various patient-linked devices, such as heart-rate monitors, oxygen saturation monitors, and infusion pumps. This way, IoT tools monitor patient health in real time and predict when a patient may be at risk of developing a complication.

“This is an IoT hospital of the first order.” —Gary Fritz, Stanford Health Care

The hospital also has IoT devices that can track the location of equipment and supplies. Specifically, the new Stanford Hospital utilizes Midmark’s real-time location system (RTLS), which uses wireless signals to determine the location of objects. Anchors placed throughout the hospital receive signals from the RTLS tags and from one another; using these signal measurements, the system triangulates each tag’s location. To date, the hospital has over 2,200 RTLS sensors, 1,500 Wi-Fi access points, and 12,000 biomedical devices.
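As an illustration of how anchor measurements can be turned into a position, here is a minimal 2D multilateration sketch in Python with NumPy. It assumes each anchor has estimated its distance to a tag (e.g., from signal strength or time of flight); the coordinates and distances are made up, and production RTLS software applies far more robust filtering.

import numpy as np

# Known anchor coordinates (meters) and their estimated distances to one tag.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0], [10.0, 8.0]])
dists = np.array([5.39, 5.39, 7.81, 7.81])  # hypothetical values, tag near (5, 2)

# Linearize the circle equations by subtracting the first one,
# then solve the resulting overdetermined system in a least-squares sense.
A = 2 * (anchors[1:] - anchors[0])
b = (dists[0] ** 2 - dists[1:] ** 2
     + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
position, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"Estimated tag position: x={position[0]:.2f} m, y={position[1]:.2f} m")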

RTLS-based badges worn by Stanford Hospital nurses (image credit)

RTLS is also used in nurses’ badges for the analysis of operational efficiency, staff movement patterns, prediction of clinical workflows, etc. This way, the hospital can track the average time spent with patients, as well as evaluate, for instance, the influence of the pandemic restrictions on operations. In addition, the badges have a button enabling nurses and physicians to discreetly call security if they are threatened by a patient.

“Our new nursing units are much larger and this mobile technology allows caregivers to be more efficient and saves them thousands of steps each day. Bar code medication administration can also be completed using the camera on the iPhone, which eliminates some of the clunkier technologies previously used.”

Christian Lindmark, Stanford Health Care

Besides RTLS, the new hospital integrated a radio frequency identification (RFID) system to monitor supplies. RFID readers are installed on supply bins on the hospital floor and in operating rooms. The data gathered by the system is used for inventory demand planning, forecasting, and reporting. The bins automatically generate an order to restock once they reach half capacity, so nurses only need to spend time on quality control. Additionally, RFID readers in operating rooms allow nurses to charge drugs to a patient account without the need to manually transcribe inventory numbers.
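As a rough illustration of that reorder rule, here is a tiny Python sketch (the bin IDs, capacities, and the 50% threshold are hypothetical; a real deployment would feed RFID counts into procurement software rather than print orders).

# Minimal sketch of the half-capacity reorder rule described above.
bins = [
    {"bin_id": "OR-3-sutures", "capacity": 40, "count": 18},
    {"bin_id": "OR-3-gloves", "capacity": 100, "count": 65},
]

def restock_orders(bins, threshold=0.5):
    """Return a restock order for every bin at or below the threshold."""
    orders = []
    for b in bins:
        if b["count"] <= b["capacity"] * threshold:
            orders.append({"bin_id": b["bin_id"],
                           "quantity": b["capacity"] - b["count"]})
    return orders

print(restock_orders(bins))  # -> [{'bin_id': 'OR-3-sutures', 'quantity': 22}]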

Another significant technological advancement introduced during the modernization of Stanford Hospital was smart rooms. They were redesigned to be more patient-centric, featuring automated lighting, temperature control, and entertainment systems, all of which can be controlled from the bedside. Patients in the smart rooms can order food and even schedule visits from a therapy dog. The innovation was introduced not only to improve patient comfort and satisfaction but also to reduce the workload of nursing staff.

Patients can adjust lighting and temperature with voice commands (image credit)

Robot-assisted automation

Since 2019, the new Stanford Hospital has been using robots for a variety of tasks, including surgery. For instance, the TUG robots are used for the automated delivery of supplies—such as linens, medications, and lab specimens. They navigate the hospital using sensors and GPS, avoid obstacles and people, and can open doors wirelessly.

Made by Swisslog Healthcare, three other robots occupy the hospital’s pharmacy. Two BoxPickers® act as automated storage for various medications, monitoring inventory in real time and automatically generating orders for daily restocks. The third robot, PillPick™, packages medications for patients into vacuum-sealed containers. PillPick dispenses 1,000 doses per hour—about the amount a pharmacist can pack manually in 5 hours.

TUG robots can carry over half a ton of supplies (image credit)

According to Gary Fritz, Chief of Applications at Stanford Health Care, the new technologies are all interconnected and designed to support both patients and hospital staff.

“What we’ve been doing is taking the best of Silicon Valley and incorporating it into our hospital.” —Gary Fritz

The challenges during modernization

The modernization of Stanford Hospital was a massive undertaking that required significant planning, resources, and execution. However, several challenges emerged during the improvement process.

One of the significant issues faced by the hospital was the need to ensure that the facility was in compliance with policies and standards. During a presentation, Christian Lindmark explained that the hospital also had to comply with privacy regulations, including HIPAA, by ensuring that EHRs and other patient information were secured.

“We have a private network within the hospital. Anything that can reside on our wired network, per regulations, can also be on our private Wi-Fi network, as well. We have numerous security protocols in place to ensure that only devices we know connect to that network within our hospital. We also have a list of all the devices that we know that can connect to our network. If there was a rogue device that tried to connect, it can only access the public network and get guest Wi-Fi, but it cannot connect to our secure network.”

Christian Lindmark, Stanford Health Care

Another challenge faced by the hospital was the need to upgrade its technology systems while minimizing the disruption to its operations. This included upgrading EHR, enterprise imaging, laboratory, and patient monitoring systems, as well as implementing a new secure messaging/alert/alarm system. According to Christian, most of the 180 applications now used in the hospital had to go through a complete architecture review. This ensured that the apps were scaled appropriately to meet the new hospital’s increased demands for bandwidth, real-time data, etc. To prepare the infrastructure for innovations, Stanford Hospital laid 21 miles of fiber optic cables throughout the facility. To minimize disruption, the hospital developed a detailed plan for the implementation of new technologies that involved extensive testing, training, and support.

Remote monitoring with IoT devices (image credit)

The implementation of robotics reduced the overall workload of staff within the hospital, but it also created additional challenges. For instance, while the PillPick robot significantly speeds up the process of packing medication, pharmacists still have to check each package. Though the robot knows exactly which medication to pack, there are instances when pills get stuck together or are broken into pieces. Robot maintenance added another challenge: select members of the staff needed to undergo technical training to be able to troubleshoot problems.

The increasing adoption of robots, sensors, devices, and other digital technologies creates new points of vulnerability. For example, hackers could potentially gain access to sensitive patient information or even control the robots to cause harm to patients or staff. Five bugs discovered in the TUG robots’ firmware in April 2022 could allow criminals to access medical records, camera feeds, and devices, as well as lock down elevators and doors. Attackers could also hijack admin sessions in the robots’ online portal and inject malware. Though the vulnerabilities were patched in time, risks like that occur on a regular basis.

A DDoS attack on Stanford Health Care and 16 other hospitals in January 2023 caused a disruption to their public services, providing another example of a possible threat. With the amount of sensitive data at stake and the number of devices, medical institutions must implement strong security measures and protocols.

The results

The modernization of Stanford Hospital has brought about significant benefits to the hospital, its staff, and patients. In November 2020, the hospital received the Gartner Healthcare and Life Sciences Eye on Innovation Award, recognizing the work done. According to a press release by Gartner, the modernization has resulted in an estimated $2 million in cost savings due to the automation of routine tasks and elimination of duplicate documentation. The care team response time has been reduced by 6x—from 12 to 2 minutes.

“Technology has allowed us to be able to meet demand in a new way. It allowed us to see more patients and to see them more frequently.”

 —David Entwistle, Stanford Health Care

Patients can now access a wide range of medical services under one roof, including diagnostic services, surgical procedures, and specialty care. The hospital’s modern equipment and technology have also improved patient outcomes, with reduced complications, infections, and readmissions. Besides, patients can now access their medical records and test results online through the MyHealth app, making it easier to engage with caregivers.

“Family members may be in a chair right next to the patient, or they may be 3,000 miles away. A daughter who lives in Washington, DC, can keep tabs on what’s happening with a parent who may be hospitalized.”

Alpa Vyas, Stanford Health Care

Stanford Health Care’s telemedicine initiative was largely successful during 2020, the height of the pandemic. 360,000 video visits were conducted, with 87% of cases resolved without additional follow-up. In 2021, the number of video appointments stabilized at around 60,000 per month, representing 30–40% of ambulatory visits across over 35 states.

“At Stanford Health Care, having telehealth fully integrated in our EHR workflows and accessible via our digital platform from anywhere on any device has not only increased access to care for patients and their families but has also eased care-team coordination. We saw continued incorporation of AI/ML capabilities across the ecosystem for enterprise productivity, diagnostics and treatments, and are excited to develop this further at Stanford in 2023.”

Michael Pfeffer, CIO, Stanford Health Care

The modernization of the hospital has brought about increased efficiency in hospital operations. The introduction of new technologies, such as the Internet of Things, resulted in reduced waiting times, faster diagnoses, and quicker treatment times. The innovations have also improved employee productivity and reduced the administrative burden, allowing staff to focus on patient care. The $2.1 billion project has also improved patient outcomes with reduced mortality rates, shorter hospital stays, and fewer readmissions.

Further reading


The article is written by Carlo Gutierrez, Alex Khizhniak, and Yaroslav Goortovoi.

The Challenges of Implementing Data-Driven Personalized Healthcare
https://www.altoroslabs.com/blog/the-challenges-of-implementing-data-driven-personalized-healthcare/
Wed, 01 Mar 2023

(Featured image credit: EY)

While innovation and EHRs enable more patient-centric care, ensuring interoperability and providing in-home care can be tricky.

The cost of an error in healthcare

The most recent findings by the AHRQ (December 2022) show that 7.4 million patients are misdiagnosed yearly in the US alone. As a result, 2.6 million suffer an adverse event, and around 370,000 patients experience serious harm stemming from a diagnostic error.

Furthermore, the WHO highlights that one death in every million worldwide is caused by medication errors. In the EU, this would translate to 163,000 deaths per year. The costs related to medication errors total up to €41.4 billion annually, excluding patients’ lost wages and productivity.

The report by Philips echoes these findings, as the costs of hospital readmissions due to post-discharge oversight amount to $17 billion in the US alone. The paper highlights that 1 in 3 adults worldwide has multiple chronic conditions and 75% of US health consumers expect their care experience to be more personalized.

It’s no surprise that a survey by McKinsey discovered growing dissatisfaction with the inability of health/insurance providers to address individual conditions. Unplanned, high-cost follow-up care increases the likelihood of changing a health insurance plan by 2.2x, the survey revealed.

While healthcare institutions have been trying to solve these issues amid the global turmoil ongoing since 2020, it’s easier said than done. According to a report commissioned by the ECAMET Alliance (February 2022), 25% of hospitals are unable to estimate the number of medication errors registered annually. The stages where errors commonly occur include administrative routine (29%), electronic prescribing (21%), manual prescribing (17%), medicine dispensing (17%), and medicine preparation (16%).

The rate of medication errors registered per year (source: ECAMET/Ipsos)

Aiming to provide customer-centric care, providers adopt various innovative technologies and concepts, such as telehealth, wearables, AI, remote patient monitoring, etc. With the growing amount of data in healthcare, EY notes that this presents an opportunity to improve outcomes across diagnosing, treatment, discharge planning, and follow-up care.

Turning clinical data into customized treatments, according to EY, may involve multiple phases. This pathway can start by assessing patient conditions with medical devices, and then this information can be combined with other historical data stored in EHRs. Combining this information, physicians can personalize treatment plans and then continuously optimize care, extending it beyond medical facilities to homes.

Reports by McKinsey and other analysts suggest a similar approach.

An example of how virtual and in-home care drive personalization (source: McKinsey)

This article reviews the details of personalized healthcare implementation and highlights the associated challenges.

A full view of a patient’s medical history

Analyzing historical patient data is key to accurate diagnostics. This should involve the current and previous information about all diseases and conditions, invasive procedures and surgeries, allergies, genetic predispositions, wellness habits, etc. For this purpose, the role of electronic health records (EHRs) cannot be overestimated.

“The electronic health record must transition from an emphasis on a person’s medical record to an emphasis on a person’s plan for health.” 

John Glaser, Harvard Medical School

A recent report by Deloitte (October 2022) outlined the potential that EHRs have in granting patients more control over care and fostering a proactive approach to health. The study discovered that healthcare providers are willing to incorporate the latest technologies into EHR systems to ensure more precise diagnoses, automation, assistance, etc. It also revealed that EHRs and patient portals are still “not intuitive enough, and improvements are needed for both clinicians and consumers.” 

Most wanted technologies for enhancing EHR systems (source: Deloitte)

A 2021 report by Philips provided similar insights.

“Cloud-based digital platforms, IoT, and AI will ultimately enable an ecosystem of connected medical and personal care devices from multiple vendors, which can collect, analyze and exchange data to help consumers, patients, care providers, and payers make timely and better informed decisions.”

Henk van Houten, CTO, Philips

At the same time, both practitioners and patients want to include more types of data into EHRs (such as those from wearables and the Internet of Things devices).

Enabling post-acute and in-home care

According to the CDC, in the US alone, 60% of adults have a chronic disease, and 40% have two or more. Ongoing monitoring of related health parameters can help to respond in time to critical events, such as high blood pressure or abnormal glucose levels. Potentially, this can prevent outcomes like a cardiac arrest.

That’s one of the reasons why 47% of wearable owners willingly share their devices’ data with healthcare providers (Deloitte, December 2022). This enables practitioners to further personalize treatment, enabling care beyond hospitals and medicine centers. (E.g., one of our customers developed an app sending oximeter/spirometer data to a doctor prior to an appointment to cut waiting time and promote a more personalized approach.)

Consumers’ attitude toward sharing health data (source: Deloitte)

McKinsey estimates that up to $265 billion worth of services (almost 25% of the total cost of care) for Medicare FFS and MA beneficiaries could shift from traditional facilities to the home by 2025.

In addition to telehealth consultations and remote patient monitoring, doctors and patients can utilize software for medication reminders, daily checkups, etc. Some of the applications and devices may have alerting functionality, ensuring emergency assistance in the most critical cases.

For instance, the Polish startup SiDLY developed a smart wristband for seniors, which has sensors for constantly monitoring essential health parameters and detecting falls. With a SOS button, a GPS tracker, and voice communication functionality, the device enables caregivers to handle emergencies beyond medical facilities in a timely manner. The rescue team of the 24/7 telemonitoring center helped to save more than 2,300 seniors in 2022 alone. The product has been adopted by local governments in several European countries, including implementations in the Adriatic–Ionian region.

Emergency services integrated with wearable devices (source: SiDLY)

Interoperability and integration

The Office of the National Coordinator for Health IT (ONC) sees sharing patient information as a way to avoid medication errors and readmissions, reduce duplicate testing, etc. However, a study by Microsoft highlights that patient records are often distributed across legacy and fragmented systems that do not interact; 72% of healthcare executives consider this a major barrier. A survey by Philips supports these findings, as 51% of healthcare leaders named data silos a real challenge, and 19% of the execs indicated difficulties in obtaining information.

A study by ECAMET Alliance notes interoperability gaps across medical institutions—in ambulatory care, only 52% of EHRs are available to the patients. 25% of patients have only partial access to their records, and 24% have none.

The access to EHRs is limited (source: ECAMET/Ipsos)

The scale of interoperability issues intensifies as EHRs need to be integrated into a bigger ecosystem of healthcare tooling. When it comes to medication history, only 55% of central pharmacies have their electronic prescribing systems integrated with EHRs.

Though access to data encapsulated in EHRs can be facilitated via APIs, a variety of regulations and policies should be considered, such as HIPAA, HITECH, etc. For instance, CMS outlines a number of APIs to implement, while ONC lists data privacy and security considerations in this regard. According to The Future Health Index 2022 report by Philips, 21% of leaders say that “data policy and/or regulations impede their ability to use data to its full potential.”

Finally, as they are required to grant patients access to individual EHR data, healthcare institutions will need to spend budgets on developing patient portals. This can be illustrated by the story of a patient who wanted to switch insurance providers to get a better care plan. She found out that the providers couldn’t share her full medical record right away, because there was no system to do so. Instead, she was offered the option to pay $723.45 for 4,823 pages of her printed medical history. In the end, the solution she had to resort to was three DVDs and an external hard drive for reading the information on a computer.

Data reliability, vulnerabilities, and the human factor

According to Philips, only 69% of healthcare leaders are sure that the data they operate with is reliable and accurate. What are the reasons behind unreliable or inaccurate records when trying to enable a more personalized, technology-driven approach? The list is by no means exhaustive:

  • EHRs may lack some of the health-critical information.
  • Patients may use wearable devices improperly or not on a regular basis.
  • Sensors may be broken, failing to collect or send the data.
  • AI algorithms used for diagnostics may have their own limitations.

At the same time, Bluetooth-enabled devices suffer from security vulnerabilities, which can be exploited by hackers to access key features. Taking into account that some devices may simply be noncompliant, all this introduces bottlenecks to reliability and availability of data.

Key vulnerabilities across a wearable device data path (source: HHS)

On top of this, prescribed medication can include temperature-sensitive drugs. Failing to ensure proper storage at home or proper transportation conditions may result in medication that, at best, does not work or, at worst, causes harm. The same applies to laboratory samples (like blood) delivered from the home. Still, a healthcare provider not only needs to be sure that samples and medicine are OK, but also wants to trust the data confirming this.

Besides, the issues related to implementation of personalized healthcare are not only on the technology side. In the US, 50% of patients do not follow their long-term medication plans for chronic conditions. Meanwhile, nonadherence leads to $500 billion in avoidable healthcare costs, almost 125,000 preventable deaths, and up to 25% of hospitalizations. This may create additional problems with in-home treatment, decreasing the value of a personalized health plan.

Increasing patient engagement

Studies demonstrate that personalized notifications via mobile apps foster adherence. For example, a patient’s failure to refill a prescription may trigger a reminder to reach the closest pharmacy. At the same time, doctors could receive notifications if a patient fails to follow the treatment or an emergency occurs.

In turn, the need for tracking the temperature of medicine and samples used for in-home care can be addressed with IoT solutions. A US-based healthcare startup collaborated with us to deliver a blood-sampling kit that enables patients to collect materials for analysis at home. The system allowed for tracking the kit’s status on its way from private locations to the clinic, ensuring the readings stayed within a specified temperature range. In addition, app notifications helped patients to take blood samples on a regular basis. To further increase confidence in the data, a solution like that could be implemented on a blockchain due to its immutable nature.
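A minimal sketch of such a temperature check is shown below (the readings, the allowed range, and the alerting step are all hypothetical illustrations, not the actual system we built).

from datetime import datetime

# Hypothetical readings logged by the kit's sensor during transport.
readings = [
    {"time": datetime(2023, 2, 1, 9, 0), "celsius": 4.2},
    {"time": datetime(2023, 2, 1, 9, 30), "celsius": 5.1},
    {"time": datetime(2023, 2, 1, 10, 0), "celsius": 9.8},  # excursion
]

ALLOWED_RANGE = (2.0, 8.0)  # assumed storage range for the sample

def find_excursions(readings, low, high):
    """Return readings that fall outside the allowed temperature range."""
    return [r for r in readings if not (low <= r["celsius"] <= high)]

excursions = find_excursions(readings, *ALLOWED_RANGE)
if excursions:
    # In a real system this would notify the clinic and flag the sample.
    print(f"ALERT: {len(excursions)} reading(s) out of range:", excursions)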

So, organizations willing to embrace personalization should be prepared to update their technology stack with innovative IoT solutions, data analytics, AI, telemedicine tools, and cloud systems. Adopting IoT devices may ensure remote and continuous visibility into the health status of patients, helping physicians to fine-tune therapies in real time. Analytics software can turn data from IoT sensors, EHR systems, or other sources into value, providing insights based on patient conditions and unique traits.

Customizing patient experience has proven to benefit both healthcare service providers and recipients. BCG reported that some organizations adopting healthcare personalization improved quality standards by 20–25%, while reducing administrative costs by 5–10% within 6–12 months.

An example of a compliant healthcare data management architecture (source: Microsoft)

However, the growing reliance on data must come with proper measures to ensure that patient information is efficiently combined, shared, and protected. Compliant data management platforms—along with solid security efforts, patient engagement, and assistance from a trusted partner—will help to achieve this goal.

Frequently asked questions (FAQ)

  • How can nonoptimal treatment influence patients?

    According to a December 2022 study by the Agency for Healthcare Research and Quality, 7.4 million patients are misdiagnosed in the US annually. This leads to adverse events for 2.6 million individuals and severe damage to ~370,000 patients.

    On top of that, the WHO reports that medication errors cause one fatality per every million people worldwide. In the EU, this results in 163,000 deaths each year and costs up to €41.4 billion annually, which does not include the lost wages and productivity of patients. Philips confirms these figures, indicating that the hospital readmission expenses due to post-discharge oversight account for $17 billion in the US alone.

  • Why is interoperability important for patient-centric care?

    The Office of the National Coordinator for Health IT (ONC) promotes exchanging patient information to achieve more accurate diagnoses, prevent medication errors and readmissions, eliminate redundant testing, and so on. Still, according to Microsoft, medical records are often stored in legacy or fragmented systems that do not interact with each other. As a result, 72% of healthcare leaders consider it as an obstacle in their practice.

    A report by Philips confirms these insights, as 51% of executives surveyed called data silos a real challenge, and 19% of the leaders noted the complexities of acquiring information.

  • What are the common challenges of implementing personalized healthcare?

    Interoperability is one of the main issues across medical institutions, according to a 2022 study by ECAMET Alliance. E.g., just 52% of EHRs are available to patients in ambulatory care, while 25% of patients have only partial access to their medical records, and 24% have none.

    In its The Future Health Index 2022 report, Philips revealed that only 69% of healthcare execs are sure their data is reliable and accurate. Besides, 21% of leaders consider policies and regulations an obstacle to making the most out of data.

    Healthcare institutions will need to spend some budgets on developing patient portals, integrations, data analysis, etc., as well as provide training to the personnel. Still, to achieve reliable outcomes, proper patient engagement, including adherence and sharing data from wearables, is also needed.

Further reading


The article is written by Alex Khizhniak, Sophia Turol, and Andrea Di Stefano.

Barriers and Guidelines to Adopting Wearables in Sports
https://www.altoroslabs.com/blog/barriers-and-guidelines-to-adopting-wearables-in-sports/
Wed, 21 Dec 2022

Athletes can rely on sensors to assess performance, but device adoption comes with technical and compliance challenges.

The sports wearable landscape

With the speed of a sprinter and the strength of a boxer, innovations are impacting every facet of the sports business. According to a survey by PwC, 69.7% of the respondents mentioned that the technologies augmenting physical activities represent a key market force in the industry. Furthermore, innovations positively impact both the athlete and fan experience.

One of the major tech trends in this regard, however, is the rising adoption of sports wearables in combination with health or fitness apps, as reported by McKinsey (2021). As the name suggests, these devices can be worn by amateurs or professional players to track relevant metrics, facilitating training and ensuring athlete welfare.

A salivary sensor for athletes’ biochemical monitoring (source: Nature)

Studies demonstrate the advantages that the data-driven approach to athlete training, unlocked by sports wearables, has over traditional coaching alone. In particular, the National Center for Biotechnology Information reported that wearable-based vibratory feedback helped to increase the swimming stroke rate. Wearable technology in sports can also facilitate the assessment of athletes’ psychological state. For instance, the universities of Lapland and Istanbul delivered a machine learning–based system helping coaches to assess the stress level of players with 85% accuracy.

As the example above shows, sports wearables are just the tip of the iceberg. We may see them as the eyes and ears of complex systems that encompass multiple technologies, such as the Internet of Things (IoT) and machine learning (ML). Deloitte describes the multilayered architecture of such platforms as follows:

  • Sensors integrated into athletes’ clothing and accessories to collect biometric data
  • A network layer to transmit this data via gateways and communication protocols
  • An integration layer to aggregate and store information, including an IoT middleware and a data storage
  • A data analytics module to process information and turn it into actionable insight
  • Dashboards to visualize the insights in an intuitive format

An example of a dashboard connected to sports wearables (source: Catapult)

Examples of the wearable technology in sports

Sports wearables can take many forms: smart watches and swim goggles, GPS-tracking vests, connected footwear, etc. These devices are equipped with a vast range of different sensors, such as inertial measurement units and biomedical gadgets to analyze motion, temperature, heart/respiratory rate, muscle contraction, and more.

According to Deloitte, such gear is commonly deployed to facilitate athlete development, player safety, and fan engagement.

  • Coaches can track athlete performance to set personalized training paths and adjust in-game strategies (including player selection during matches and recovery times). For instance, CNN reported that many professional players in the English Premier League have started wearing GPS vests from Catapult Sports to optimize their workload and performance. The equipment monitors several metrics—such as power score, sprint distance, and top speed.
  • Physicians utilize the physiological and biochemical information provided by sports wearables to make informed decisions. This helps to minimize the risk of injuries and speed up player recovery with suitable therapies. According to ABC News, the NBA is turning to smart wearable fabrics by Nextiles for injury diagnosis and prevention. The devices can track a multitude of parameters—including jump height, impulse and power of the jump, symmetry of the legs, ankle angle, etc.
  • The wearable technology in professional sports also means entertainment. Real-time analysis of sports data provides fans with additional statistics and live insights to enhance their experience.

In-game metric tracking based on sports wearables (source: Catapult)

Adoption challenges of the wearable technology in sports

Data is the real fuel of every technology sports wearables rely on, such as the IoT and machine learning. However, this can be both an advantage and a disadvantage. The most common challenges related to smart device adoption in the sports industry reflect this dualism.

Integration and processing complexities

Imagine a football trainer in a live match, coordinating a group of players that don’t speak the same language. The coach needs to provide feedback to the athletes based on their roles (goalkeeper, striker, etc.). As highlighted by Amazon, sports wearables and related analytics software face similar issues:

  • multiple types of data (heart rate, muscle contraction, etc.)
  • different network protocols, standards, or technologies (such as Wi-Fi and Bluetooth) used to communicate with system components
  • streaming data in real time

Fortunately, there’s a vast range of platforms and services on the market to help perform the above processes. For instance, Amazon Data Lake or Azure IoT Central.

An architecture to store data from wearables in a data lake (source: Amazon)

According to Microsoft, time-series databases help to capture and process data in real time. Using time stamps, these databases organize information in a chronological order. This includes data from wearables for wellness monitoring (such as cardiovascular screening). So, it is possible to compare current (e.g., live electrocardiogram feeds) and historical data to detect anomalies, generate real-time alerts, perform predictive modeling, visualize trends, etc.
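As a simple illustration of comparing live readings against a historical baseline, here is a small Python sketch (hypothetical heart-rate values and a naive three-sigma rule; a production system would query a time-series database and use more careful statistics).

from statistics import mean, stdev

# Hypothetical historical resting heart-rate samples for one athlete.
history = [58, 60, 57, 59, 61, 58, 60, 59, 62, 58]

# Live readings streamed from a wearable: (time stamp, beats per minute).
live = [("10:00:00", 59), ("10:00:30", 61), ("10:01:00", 96)]

baseline, spread = mean(history), stdev(history)

for timestamp, bpm in live:
    # Flag readings more than 3 standard deviations from the baseline.
    if abs(bpm - baseline) > 3 * spread:
        print(f"{timestamp}: anomalous reading {bpm} bpm "
              f"(baseline {baseline:.1f} ± {spread:.1f})")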

Another option for managing real-time data is NoSQL databases. One of the pros is the possibility to organize heterogeneous data in flexible structures that can be easily updated. NoSQL databases also offer the scalability and performance needed to handle immense volumes of information without downtime.

If training takes place in different locations, consolidating data in the cloud provides coaches with a single point of access to all the necessary information. For example, a Norwegian provider of sports-oriented hardware and software partnered with us to build a cloud platform that monitors and analyzes athletes’ physical performance. The system gathers and consolidates information from 13 types of smart devices distributed across multiple training locations. Based on this historical data, it is possible to analyze progress over time and develop data-driven training strategies.

Accuracy and false positives

When it comes to tracking sports performance, accuracy might not be a matter of life and death. However, when wearables are used to monitor biometric data and ensure athlete welfare, a high rate of false positives and negatives can lead to misdiagnoses. In this regard, research from the American College of Cardiology suggests that wearables have shown promise for cardiovascular disease screening of athletes. On the other hand, the same study highlights that similar tools are still far from perfect.

Smartwatch-based vs. patch-based cardiac monitoring (source: Frontiers)

Detecting anomalies (such as arrhythmia) that may be signs of health issues is an ideal scenario for ML systems. Algorithms should be trained with clinical data and learn to recognize recurring patterns or outliers. Microsoft recommends utilizing principal component analysis (PCA) as one of the approaches. This technique learns the dominant patterns of variation in data considered “normal” across a range of relevant features. It then uses distance metrics to spot readings that deviate strongly from these patterns and, therefore, represent anomalies.

In turn, IBM suggests setting a threshold that defines the minimum deviation to be treated as an actual anomaly and trigger a response. The trade-off between true positives, false positives, and false negatives should be carefully weighed, though. Otherwise, the system may simply end up ignoring anomalous conditions.
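To make the PCA-plus-threshold idea more tangible, here is a minimal sketch with scikit-learn on synthetic data (the number of components and the 99th-percentile threshold are arbitrary illustrative choices, not recommendations from the vendors cited above).

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic "normal" training data: two correlated vitals per sample.
heart_rate = rng.normal(60, 5, 500)
resp_rate = 0.25 * heart_rate + rng.normal(0, 1, 500)
X_train = np.column_stack([heart_rate, resp_rate])

# Fit PCA on normal data and keep the dominant component.
pca = PCA(n_components=1).fit(X_train)

def reconstruction_error(X):
    """Distance between each sample and its PCA reconstruction."""
    X_rec = pca.inverse_transform(pca.transform(X))
    return np.linalg.norm(X - X_rec, axis=1)

# Threshold: 99th percentile of errors observed on normal data.
threshold = np.percentile(reconstruction_error(X_train), 99)

X_new = np.array([[61, 15.5], [62, 30.0]])  # second sample breaks the pattern
errors = reconstruction_error(X_new)
print(errors > threshold)                   # expected: [False  True]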

Battery life

Ongoing monitoring of an athlete’s physical condition during training is an essential requirement of any wearable. While, in manufacturing, devices on stationary machinery can be easily recharged, sports requires a compromise between durability and portability.

As pointed out by Garmin, one of the leaders in device manufacturing, several features can impact battery life—including GPS tracking, maintaining connectivity, and widgets. Some of these (such as Wi-Fi and Bluetooth) may be redundant and worth switching off to minimize battery drain, but that is not always possible, depending on the activity.

A viable option to mitigate these drawbacks is to reduce the energy consumption of wearable devices via low-power communication techniques and protocols. Microsoft suggests relying on the Bluetooth Low Energy protocol, as well as LPWAN technologies such as LoRaWAN, which are optimized for low power consumption.

Security and regulatory compliance

The growing sensitivity toward data protection has led to the proliferation of standards and regulations, including the GDPR and HIPAA. Legislation on this matter may limit the use of data in sports, especially when wearables are integrated into electronic health records, as Deloitte pointed out (October 2022).

This tension might be further amplified due to the exploitation of athlete information by private organizations. For example, the technology company Genius Sports has signed an agreement with the Mid-American Conference, acquiring the rights to manage sports data and sell it to betting companies. Some of the athletes may not be fine with that.

Finding a trade-off between algorithms’ infamous data hunger and protection will be a matter of study for the years to come. Meanwhile, sports companies should adopt solutions built in full compliance with the data management legislation applicable to the industry. In the European Union, the GDPR defines requirements around collecting, storing, and processing personal data. For example, it established the minimization principle, which limits the use of data based on relevance and necessity. Companies should implement policies to disclose how personal information is gathered and processed (including the storage of cookies). At the same time, users should be able to provide or withdraw their explicit consent and apply their right to be forgotten.

Furthermore, any data analytics platform and related devices must be equipped with security features against data leaks or cyberthreats. After all, any attack against a vulnerable point may disrupt the network as a whole, due to the interconnected nature of wearable devices in sports. To address that, Amazon lists measures such as security keys for data encryption and access to resources based on the least privilege principle. In its turn, Deloitte recommends implementing security information and event management for analytics and reporting, intrusion prevention and detection, risk identification, etc. You can also rely on cloud-based tools for security governance, such as AWS Control Tower or Azure Active Directory.

A technology-driven, more accessible play

Despite their advantages and disadvantages, wearable technologies appear to be influencing, if not completely reshaping, the sports industry. This is evident in everything from using devices to improve the performance of individual athletes to developing strategies for the whole team.

For instance, the tech company Mojo Vision partnered with Adidas and some other fitness brands to integrate its proprietary data-tracking smart lenses into consumer products. The technology allows overlaying fitness performance data and augmented-reality graphics without obstructing vision. This can be useful for athletes and amateurs preparing for marathons. In this regard, Adidas is contemplating adding such a feature to its Runtastic apps.

Another wearable-related trend to keep an eye on is inclusivity. For example, CNN revealed that French paratriathlete Charles-Edouard Catherine relied on the vibratory feedback from a GPS device by Wayband to run independently. Wider access to sports may be a solid counterargument in the current debates over the use of technology in this industry.

That said, research is ongoing to ensure that wearables level up in helping athletes to monitor their physical condition, prevent injuries, improve performance, and reach new heights.


This blog post was written by Andrea Di Stefano, edited by Sophia Turol and Alex Khizhniak.

The Ins and Outs of the IoT for Transportation and Logistics
https://www.altoroslabs.com/blog/iot-in-transportation-and-logistics-use-cases-challenges-and-best-practices/
Tue, 13 Dec 2022

While optimizing fleet management, routes, asset tracking, and driver engagement, the Internet of Things requires security, connectivity, budgets, and expertise.

The state of the IoT in transportation

Today, transportation and logistics are making the world interconnected and “smaller.” Over the last two decades, the Internet of Things (IoT) has further amplified this effect, providing the industry with networks of devices to meet both operational and analytical requirements. These include accurate insights into the supply chain, automation, cost efficiency, and enhanced decision-making, according to a 2021 report by Inmarsat, a global satellite communications provider. Besides, 54% of respondents considered the IoT important to address the supply chain issues that arose during the COVID-19 pandemic.

These advantages are likely to further catalyze the adoption of the IoT in the transportation industry in the near future. Ericsson expects the number of cellular IoT connections in transportation to grow from 100 million in 2020 to 292 million in 2030.

Adoption drivers for implementing the IoT (source: Inmarsat)

The Internet of Things in transportation relies on sensors that collect data from moving assets, be it vehicles or goods, and from related infrastructures (such as terminals and warehouses). This information is then processed by data analytics solutions and used to harmonize the supply chain (route optimization, load capacity, maintenance, driver safety, etc.).

Despite the advantages of IoT in logistics and supply chain, several companies get stuck in what McKinsey defines as a “pilot purgatory.” This article sheds some light on the barriers that occur during IoT adoption in transportation and explains why companies still invest in innovations like this despite the obstacles.

Common scenarios

McKinsey, Deloitte, and Verizon observe the following most common scenarios for adopting the IoT in transportation.

1. Fleet management

Coordinating vast fleets of vehicles requires huge organizational efforts. IoT systems help to streamline such workflows in several ways. Transportation providers can supervise demand and fleet availability to optimize vehicle allocation. They can also adapt routes based on location, traffic, weather, or road conditions, as well as send updates (including estimated arrival time) to drivers and partners via integrated communication systems.

For example, DHL implemented an IoT-based fleet management system named SmarTrucking to improve fleet scheduling and shipment visibility. According to the company, the solution may reduce transit times by up to 50%.

2. Tracking assets and goods

This use case of the IoT in logistics can involve both vehicles and the goods they transport, such as pallets and parcels. Asset tracking represents a key element to efficient management of the supply chain and workflows in warehouses, yards, and terminals. This includes operations such as outbound order shipping, loading in docking stations, truck queuing at ports, etc.

In this regard, Airbus implemented a GPS-based IoT system to track 15,000 connected packages, minimize asset misplacements, and send timely notifications in case of deviations. Another example would be a Norwegian software provider for transportation that partnered with us to automate order planning. The developed platform enabled each of 100+ logistics companies using it to facilitate the delivery of 1,000 orders on a daily basis.

An example of an asset tracking system architecture (source: Microsoft)

3. Condition monitoring and maintenance

Combining IoT sensors and ML-based software, it is possible to collect vehicle health data, such as fuel consumption, tire pressure, and engine performance. This data is analyzed via ML algorithms to identify any deviations from standard conditions and predict future failures. Once the system detects an anomaly, it can send an alert and request maintenance.

The mining corporation Rio Tinto, for example, adopted a pilot IoT platform based on Azure to monitor trucks and other equipment. The initiative facilitated asset maintenance and streamlined the supply chain.

4. Driving optimization

An IoT platform can keep track of shifts and resting times to facilitate compliance with labor regulations. It is also possible to monitor and analyze driving behavior to offer insights that will help improve safety while reducing fuel consumption.

A similar solution for truck driver assistance and trailer monitoring has been developed by ZF Group, a German manufacturer of automotive components. The software aims to promote safer and more eco-friendly driving behavior, as well as to cut a mid-size truck transport company’s expenses by 6%.

One step toward the future of the IoT in logistics will be the large-scale adoption of autonomous vehicles, where drivers would play a merely supervisory role. However, such solutions are not yet in commercial use, as highlighted by McKinsey.

5. Cold-chain delivery

Transporting temperature-sensitive products, such as medicine, vaccines, or food, can be particularly complex both in operational and regulatory terms. Cold-chain delivery is aimed at addressing this challenge, and the Internet of Things in transportation can be one of its most valuable tools. Sensors are commonly used to analyze and ensure optimal cargo conditions, therefore, maintaining product quality and minimizing losses.

For instance, an air cargo carrier partnered with us to develop an IoT system that can monitor temperature and humidity across up to 10,000 aircraft transporting pharmaceuticals. The testbed for the project comprised 3,000 sensors with a possibility to scale on demand.

What are the technical challenges?

Issue #1. Integration

Integrating the multiple components that comprise an IoT system can be rather complex. In logistics, IoT systems rely on thousands of sensors mounted on transportation means and assets to gather all sorts of data: location, speed, temperature, humidity, etc. To deliver meaningful insights and analytics, all this data has to be gathered and processed in real time. The report by Inmarsat reveals that integration is one of the top challenges for organizations, both during deployment and once deployed.

IoT adoption barriers in transportation and logistics (source: Inmarsat)

The thing is that sensors may come from different manufacturers and employ different networking protocols under the hood. So, in addition to integrating sheer volumes of real-time data, there is an issue with the compatibility of underlying protocols and data formats they use. The need to store this data for analysis and reporting, as well as synchronize it with corporate systems only amplifies the problem.

For instance, one of our customers needed to gather data from 5,300 devices in real time—for its fleet management platform.

When it comes to big data in the IoT, Amazon suggests pairing traditional relational databases with NoSQL solutions. The latter store data in a distributed manner as objects (documents or key-value pairs), which promotes performance, consistency, and flexibility. There are numerous options out there, including Couchbase Server, MongoDB, Cassandra, etc.

At the architecture level, serverless solutions bring quite a few perks to the table. In essence, the concept implies that all the trouble of allocating and managing computing resources is on the cloud provider’s side. According to DataStax, a serverless approach and IoT systems are a perfect match if your priorities are performance under high loads, availability during spikes, and scalability.

Moving on to protocol compatibility, it’s worth considering the implementation of an enterprise service bus (ESB). There are also specific tools to tackle the issue of heterogeneous protocols and data—e.g., AWS IoT Greengrass.
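As a simple illustration of the data-format side of this problem, the sketch below normalizes two hypothetical vendor payload formats into one common schema (the field names and formats are made up; in practice, this logic would live in middleware such as an ESB or an edge runtime).

import json

# Two hypothetical vendor payloads describing the same kind of reading.
vendor_a = '{"dev": "truck-17", "ts": 1670928000, "tempC": 3.9}'
vendor_b = '{"deviceId": "TRAILER-04", "time": "2022-12-13T10:00:00Z", "temp_f": 39.1}'

def normalize(raw: str) -> dict:
    """Map vendor-specific fields onto a single internal schema."""
    msg = json.loads(raw)
    if "tempC" in msg:                       # vendor A format
        return {"device": msg["dev"], "timestamp": msg["ts"],
                "temperature_c": msg["tempC"]}
    if "temp_f" in msg:                      # vendor B format
        return {"device": msg["deviceId"].lower(), "timestamp": msg["time"],
                "temperature_c": round((msg["temp_f"] - 32) * 5 / 9, 1)}
    raise ValueError("Unknown payload format")

print(normalize(vendor_a))
print(normalize(vendor_b))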

Issue #2. Connectivity

While the IoT in manufacturing typically deals with stationary assets, the same can’t be said about transportation and logistics. This means that a truck may enter cellular network blind spots, such as tunnels or rural areas, resulting in high latency or lost connection between sensors and an IoT platform.

To address this disadvantage, HiveMQ recommends relying on a publish–subscribe pattern with IoT sensors loosely coupled to other platform components. When a connection is available, IoT devices deliver and receive data in real time. Otherwise, these devices buffer data locally with assigned time stamps, creating an offline message queue. As soon as the connection is restored, all buffered information is sent to the cloud, preventing any loss of messages. Such data points with associated time stamps are typically stored in time-series databases, as they are optimized to handle this type of information.
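The buffering logic itself can be sketched in a few lines of broker-agnostic Python (the transmit and network_is_up functions are stubs standing in for a real MQTT or HTTP client; this only illustrates the store-and-forward pattern, not a specific vendor’s implementation).

import time
from collections import deque

buffer = deque()  # offline queue of time-stamped messages

def network_is_up() -> bool:
    return False  # pretend the truck is in a tunnel for this example

def transmit(message) -> bool:
    """Stub standing in for a real MQTT/HTTP publish; True on success."""
    # A real device would attempt delivery and catch network errors here.
    return network_is_up()

def publish(payload):
    """Send immediately when possible; otherwise buffer with a time stamp."""
    message = {"ts": time.time(), "payload": payload}
    if not transmit(message):
        buffer.append(message)

def flush_buffer():
    """Called when connectivity is restored: replay queued messages in order."""
    while buffer and transmit(buffer[0]):
        buffer.popleft()

publish({"speed_kmh": 72, "fuel_l": 318})
print(f"{len(buffer)} message(s) buffered while offline")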

When it comes to latency in transportation scenarios, another solution suggested by McKinsey is edge computing. The idea is to move part of the storage and computing resources from traditional data centers to the devices, where data is actually generated (vehicles or terminals). This minimizes the distance between data sources (IoT devices) and the processing power, reducing dependency on the network.

An example of an IoT architecture for transportation (source: Deloitte)

Issue #3. Security

Wide networks of interconnected IoT devices can suffer from multiple points of vulnerability. This makes cybersecurity a top-tier parameter when building or choosing an IoT solution, as pointed out by McKinsey. Among the features to protect your IoT platform from data breaches and hacks, Amazon mentioned access management, device authentication and authorization, data encryption, event management, etc.

Further protection can be achieved by relying on cryptographic protocols, including Transport Layer Security (TLS), or technologies such as blockchain. The latter was mentioned by Deloitte as one of the innovations set to enhance the role of IoT in logistics. Specifically, blockchain-based architectures and algorithms make sure that information collected through IoT devices is not altered after being stored. In practical terms, this prevents carriers from modifying any relevant documentation, such as waybills.
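The tamper-evidence idea can be illustrated with a tiny hash chain in Python (a deliberate simplification of what DLT platforms provide; the waybill records are hypothetical).

import hashlib
import json

def add_record(chain, record):
    """Append a record whose hash covers both its content and the previous hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(record, sort_keys=True) + prev_hash
    chain.append({"record": record,
                  "prev_hash": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

def verify(chain) -> bool:
    """Recompute every hash; any edited record breaks the chain."""
    for i, block in enumerate(chain):
        prev_hash = chain[i - 1]["hash"] if i else "0" * 64
        payload = json.dumps(block["record"], sort_keys=True) + prev_hash
        if (block["prev_hash"] != prev_hash or
                hashlib.sha256(payload.encode()).hexdigest() != block["hash"]):
            return False
    return True

chain = []
add_record(chain, {"waybill": "WB-1001", "temp_c": 4.1})
add_record(chain, {"waybill": "WB-1001", "temp_c": 4.3})
print(verify(chain))                 # True
chain[0]["record"]["temp_c"] = 8.0   # attempt to alter a stored reading
print(verify(chain))                 # False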

An example of an IoT architecture with TLS (source: GSMA)

Issue #4. Battery durability

Batteries run out, and those powering IoT sensors are no exception. The European Commission reported that most devices have a 10-year operational life, while their batteries can last less than 2 years. Research in this field is ongoing, focusing on the creation of batteries that can recharge themselves through heat, vibration, or light (for example, via microsolar cells).

In the meantime, what transport providers can do is minimize consumption. Low-power protocols and technologies, such as LoRaWAN, can help in this regard. As for short-range connectivity, Inmarsat’s report mentioned Bluetooth Low Energy (BLE). Being a short-range technology, however, BLE can be used only for specific logistics operations rather than long-range monitoring.

Using LoRaWAN gateways in the IoT architecture (source: Microsoft)

Issue #5. Load spikes

Holidays and rush hours can put a strain on logistics, as much as on IoT systems used in this industry, due to the consequent spikes in network traffic. A first solution to address this issue is proper network utilization via bandwidth-efficient and lightweight protocols, such as MQTT. In this regard, a test by HiveMQ revealed that sending 100 messages via HTTP required 554,600 bytes, while the same operation with MQTT took only 44,748 bytes.

Another option suggested by HiveMQ is to build a masterless cluster architecture. Basically, networks of cluster nodes (i.e., processing or storage units) without shared resources serve thousands of IoT devices. Depending on the network load, the number of nodes can be scaled up or down. The devices connected to a certain node can simply switch to another one and resume the current session.

Issue #6. Device maintenance

An IoT sensor can monitor the status of assets and goods. But who monitors the sensor itself? And how does one upgrade software on the edge (e.g., install security patches)? These tasks can be challenging due to both the sheer number of devices involved and their geographic distribution across vast logistics ecosystems.

To ensure ongoing, remote supervision over each sensor, device management software may help (such as this one by Amazon). Such tools make it possible to (see the sketch after the list below):

  • update firmware on devices remotely
  • detect anomalies
  • reboot sensors
  • configure automatic alerts, etc.
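
For illustration, here is a hedged sketch of how a bulk firmware update could be scheduled with AWS IoT Device Management via boto3; the thing-group ARN, job ID, and firmware URL are hypothetical placeholders, and the job-document structure is user-defined rather than a fixed AWS schema.

```python
# A hedged sketch of pushing a firmware-update job to a fleet of sensors
# with AWS IoT Device Management (boto3); ARN, job ID, and URL are made up.
import json
import boto3

iot = boto3.client("iot", region_name="us-east-1")

iot.create_job(
    jobId="firmware-update-2023-06",
    targets=["arn:aws:iot:us-east-1:123456789012:thinggroup/warehouse-sensors"],
    document=json.dumps({
        "operation": "firmwareUpdate",                      # user-defined job document
        "url": "https://updates.example.com/sensor-fw-1.4.2.bin",
    }),
    description="Roll out firmware 1.4.2 to warehouse sensors",
    targetSelection="SNAPSHOT",   # apply once to the devices currently in the group
)
```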

These are only the major technical challenges. Other issues, such as the lack of expertise or limited budgets, are also frequently reported by companies.

Optimized fuel costs and ROI

The IoT has proven extremely valuable for transportation and fleet management. A report by Verizon published in November 2022 found that GPS fleet tracking helped respondents decrease fuel costs by 9% on average. GPS adoption also led to a 17% reduction in expenses associated with vehicle accidents and a 12% decrease in labor costs. On top of this, 31% of companies reported a positive ROI within six months of adopting fleet tracking. As for asset monitoring implementations, 55% saw a positive ROI in less than a year.

The results observed after adopting GPS-based fleet tracking (source: Verizon)

On the other hand, the Internet of Things creates new risks and points of vulnerability affecting other supply chain systems. However, a focus on cybersecurity, proper integration, and reliable connectivity can minimize such issues, making sure that the IoT is an asset and not a problem.

This article was written by Andrea Di Stefano, Alex Khizhniak, and Sophia Turol.

The post The Ins and Outs of the IoT for Transportation and Logistics first appeared on Altoroslabs Technology Blog | Latest News in Custom Software Development.

]]> https://www.altoroslabs.com/blog/iot-in-transportation-and-logistics-use-cases-challenges-and-best-practices/feed/ 0 Scaling Industrial IoT in Manufacturing: Challenges and Guidelines https://www.altoroslabs.com/blog/scaling-industrial-iot-in-manufacturing-challenges-and-guidelines/?utm_source=rss&utm_medium=rss&utm_campaign=scaling-industrial-iot-in-manufacturing-challenges-and-guidelines https://www.altoroslabs.com/blog/scaling-industrial-iot-in-manufacturing-challenges-and-guidelines/#respond Wed, 30 Nov 2022 17:25:11 +0000 https://www.altoroslabs.com/blog/?p=110 Explore IIoT’s essentials and use cases, along with the best practices to address the most common adoption and scaling challenges.

The post Scaling Industrial IoT in Manufacturing: Challenges and Guidelines first appeared on Altoroslabs Technology Blog | Latest News in Custom Software Development.

]]>
Explore IIoT’s essentials and use cases, along with the best practices to address the most common adoption and scaling challenges.

The Industrial IoT landscape

Over the last century, the factory archetype portrayed in Charlie Chaplin's "Modern Times" as an unsafe, dehumanizing workplace has been turning into a smart, automated, and interconnected ecosystem.

Among the drivers of this shift, especially in the past 20 years, we may mention the growing reliance on the Industrial Internet of Things (IIoT) for data-driven decision-making, real-time process supervision, and workflow optimization. A 2021 study by IBM mentioned IIoT as one of the four key technologies to help manufacturers in their digital transformation journey.

The most important technology initiatives for manufacturers (source: IBM)

According to McKinsey, the global IIoT market will grow from $290 billion in 2020 to $500 billion in 2025. On the other hand, another report by McKinsey highlighted that 70% of manufacturers surveyed could not scale IoT beyond pilots. Furthermore, a 2022 study by Bain & Company found that four-fifths of companies are scaling fewer than 60% of IIoT proofs of concept.

What makes the Industrial Internet of Things in manufacturing so beneficial, but also complex to implement and scale? This article examines IoT’s major scenarios, pros, and adoption barriers in manufacturing, along with some best practices and guidelines to follow.

How does IIoT help in manufacturing?

The Industrial Internet of Things involves the use of smart sensors and devices to collect operational data from machinery, power systems, or other assets. This data can then be turned into actionable insights via data analytics software. For example, manufacturers can rely on this information to monitor industrial equipment performance, identify process bottlenecks, and predict machinery failures. As a result, this helps to define suitable strategies and initiatives to improve safety, productivity, and efficiency.

As pointed out by Microsoft, Industrial IoT systems are often reliant on cloud technologies and built on a multilayered architecture including:

  • A network of IoT sensors gathering data from machines and routing it to a cloud through a gateway device
  • A data analytics system, which may be powered by a machine learning (ML) engine to provide insights or trigger certain actions
  • Visualization modules on tablets, screens, smart glasses, or mobile devices—to present insights in an intuitive format (dashboards, schemes, views, etc.)

An example of a cloud-based IIoT architecture (source: Microsoft)

Top 5 IIoT scenarios in manufacturing

At the same time, McKinsey names use case identification as one of the main challenges in implementing and scaling Industrial IoT (February 2021). Therefore, it’s worth looking into some of the best options to take advantage of this technology.

1. Operations optimization

According to Microsoft’s report (August 2022), the primary goals for deploying Industrial IoT are associated with operational improvements. In this regard, 86% of manufacturers consider equipment effectiveness the most important criterion—along with optimization of production output, quality, or timely delivery.

The goals of manufacturers implementing IIoT systems (source: Microsoft)

McKinsey, too, describes the optimization of industrial operations as the use case with the highest potential economic value. To achieve that, companies adopt Industrial IoT to monitor machinery, analyzing KPIs in real time and fine-tuning equipment utilization.

2. Condition-based and predictive maintenance

Industrial IoT sensors enable real-time monitoring of equipment conditions, such as temperature or vibrations. During this process, machine learning can be utilized to detect anomalies and take actions based on the telemetry data.

By utilizing event-driven architectures, the system can trigger an alert, such as recommending maintenance operations, before a failure actually occurs. As a result, this can ensure a safer working environment and lower repair costs. In this regard, Microsoft estimates a 28% increase in condition-based maintenance investments by manufacturers over the next three years. A similar figure (26%) is expected for predictive maintenance.
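
As a simple illustration of the approach (not a reference to any vendor's product), the sketch below flags vibration readings that deviate sharply from the recent baseline and raises a maintenance alert; the window size, threshold, and data are purely illustrative.

```python
# A minimal illustration of condition-based alerting: flag vibration
# readings that deviate strongly from the recent baseline.
from collections import deque
from statistics import mean, stdev

WINDOW = 60          # number of recent readings forming the baseline
Z_THRESHOLD = 3.0    # how many standard deviations count as an anomaly

history = deque(maxlen=WINDOW)

def check_vibration(reading_mm_s: float) -> bool:
    """Return True (and alert) if the reading looks anomalous."""
    if len(history) >= 10:                      # need a minimal baseline first
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(reading_mm_s - mu) / sigma > Z_THRESHOLD:
            print(f"ALERT: vibration {reading_mm_s} mm/s deviates from baseline "
                  f"({mu:.2f} +/- {sigma:.2f}); schedule an inspection.")
            return True
    history.append(reading_mm_s)
    return False

for value in [2.1, 2.0, 2.2, 2.1, 2.3, 2.0, 2.2, 2.1, 2.0, 2.2, 9.7]:
    check_vibration(value)   # the last reading triggers the alert
```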

To facilitate the adoption of IIoT in this specific business function, several service providers offer comprehensive cloud-based solutions for predictive maintenance, including Amazon. (Amazon's reference cards on the topic may be helpful.)

An example of an IoT architecture for predictive maintenance (source: Amazon)

3. Energy management

A manufacturer can also monitor the energy consumption of industrial equipment through a wide network of smart meters deployed across the plant. This makes it possible to improve transformers' efficiency, minimize load losses, or even predict consumption patterns and peak loads, optimizing energy procurement and distribution. McKinsey found (2021) that IoT implementation can help organizations enhance energy efficiency by up to 50%.

4. Asset tracking

In manufacturing, the Industrial Internet of Things can also be used to monitor the current location of numerous assets distributed across warehouses. To achieve this, IIoT can rely on radio-frequency identification (RFID) tools.

This technology enables manufacturers to label their inventory items with a tag containing essential data. Then, it is possible to track the location or flow of assets via multiple RFID readers connected to an IoT system (as GE did). According to Microsoft, inventory management and location tracking are among the top three scenarios for IIoT products, along with monitoring dashboards.
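
The sketch below illustrates the basic idea of turning RFID reader events into a last-known-location view per asset; the tag IDs, reader names, and locations are made-up examples rather than a particular platform's API.

```python
# A minimal sketch of asset tracking: RFID readers report tag sightings,
# and the platform keeps a last-known-location view per asset.
from datetime import datetime, timezone

# reader ID -> physical location it covers (illustrative values)
READER_LOCATIONS = {
    "reader-dock-01": "Warehouse A / receiving dock",
    "reader-aisle-07": "Warehouse A / aisle 7",
}

last_seen: dict[str, dict] = {}   # tag ID -> {"location": ..., "at": ...}

def on_rfid_read(tag_id: str, reader_id: str) -> None:
    last_seen[tag_id] = {
        "location": READER_LOCATIONS.get(reader_id, "unknown"),
        "at": datetime.now(timezone.utc).isoformat(),
    }

on_rfid_read("EPC-0042", "reader-dock-01")
on_rfid_read("EPC-0042", "reader-aisle-07")
print(last_seen["EPC-0042"])   # the latest sighting wins
```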

5. Digital twins

As the name suggests, a digital twin acts as a virtual copy of a factory or specific equipment. It is a representation of the key assets and processes used by manufacturers for simulation, supervision, and testing of industrial objects. The role of IIoT here is to connect the plant (or equipment) to its digital counterpart, collecting data on-site and translating it into virtual representations.

This enables manufacturers to remotely monitor machinery condition and workflows in real time. The technology can also be used to share information such as inventory levels among procurement partners and distributors, optimizing the supply chain.

Despite the benefits, Microsoft reports that digital twins are still not so widespread—due to a combination of technical complexities, integration challenges, and skill gaps. However, the study suggests that manufacturers are expected to increase investments in this regard by 26% over the next three years.

The Industrial Internet of Things examples in real life

The successful applications of the Industrial Internet of Things can help to better understand the impact of this technology on manufacturing.

Unilever

This global provider of consumer goods implemented an IoT platform to create a digital twin of the company’s facilities and improve control over operations. After launching a pilot project at its factory in Valinhos (Brazil), Unilever was able to optimize soap and ice cream production, as well as reduce energy costs by $2.8 million.

Microsoft

Microsoft deployed an Industrial IoT platform for its own plant in Suzhou (China). The system is powered by machine learning, gathering and analyzing inventory data. As a result, the platform facilitated the identification of stocks on the verge of obsolescence and helped to cut inventory costs by $200 million.

A sample IIoT architecture for monitoring equipment condition (source: Microsoft)

Texmark

This US-based petrochemical company implemented an IIoT system, introducing sensors and data analytics to monitor its plant and minimize the risk of human error. After adoption, planned maintenance costs were reduced by 50%.

Tenaris

Tenaris, a global manufacturer of steel pipes, partnered with ABB, a major electrical equipment producer, to build an IoT-based predictive maintenance solution. Deployed at Tenaris’s plant in Italy, the system relies on a network of IoT sensors to monitor 460 electric motors. The goal is to detect vibrations or power anomalies that may be signs of impending failures—therefore, reducing the frequency of incidents.

5 major IIoT adoption challenges and guidelines

Implementing and scaling IIoT in manufacturing is a rewarding but challenging journey, encompassing both technological and organizational aspects. Here are some guidelines to avoid the common missteps.

1. Legacy systems and integration

The report by Bain & Company names the complexity of integration as the main barrier to scaling IIoT proofs of concept (September 2022). The layered architecture of an IIoT environment inevitably includes different types of software and hardware, hundreds or even thousands of apps and devices. One of the problems here is that each component of an IIoT system may rely on different network protocols and handle specific data formats.

In particular, HiveMQ reports that HTTP (55%), MQTT (48%), Modbus (41%), and OPC Unified Architecture (33%) represent the most common protocols for connecting equipment (October 2022).

The variety of communication protocols in IIoT today (source: HiveMQ)

This issue is further amplified by the presence of legacy systems utilizing outdated technologies. The combination of the factors above prevents the components of an IIoT platform from seamlessly operating as a whole. Consequently, organizations risk creating data silos or facing data consistency issues.

In an effort to bridge these systems, manufacturers try to integrate myriad software components and devices via application programming interfaces (APIs), if the latter are delivered out of the box. If not, companies spend months developing APIs and brokers from scratch or finding other ways to transfer signals and information between equipment, production lines, BI/ERP modules, etc.

To address this challenge, it is worth taking into account the following initiatives.

  • Protocol standardization: On the one hand, early adoption of IIoT devices sharing the same communication protocols can simplify integration. On the other hand, in reality, we will deal with legacy IoT devices sooner or later, making it necessary to create an Enterprise Service Bus (ESB) or its analogue at some point. This architecture represents centralized middleware that converts different communication protocols, connecting legacy devices via customized adapters. McKinsey also highlights that a focus should be “placed on having correctly labeled data (especially time stamps) to make use of data.”
  • API development: McKinsey emphasizes microservices (below) and APIs as “the key to developing a technical platform capable of supporting the level of flexibility and agility needed.” APIs can be created from scratch and may involve the adoption of tools that automate the development. As for the most popular type, Red Hat recommends REST APIs for IoT integration due to their lightweight nature and scalability (see the sketch after this list).

The systems IIoT platforms need to integrate with (source: HiveMQ)

  • Microservices: Integrating and scaling complex IIoT systems requires continuous changes and upgrades. However, ongoing maintenance can disrupt the system as well as the processes relying on it. As a response to this challenge, microservices can replace typical monolithic architectures with modular, loosely coupled units that work independently without affecting the other components. One of the ways to amplify the ease of integration and scalability ensured by microservices is implementing the serverless model. Here, a cloud provider takes care of the underlying infrastructure on which a particular microservice/function runs, reducing operational efforts.
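
As a minimal sketch of the REST-style integration layer mentioned in the API bullet above, the example below exposes shop-floor telemetry through two endpoints, assuming FastAPI; the endpoint paths and payload shape are hypothetical, not any vendor's API. Such a service would typically run behind an API gateway with an ASGI server such as uvicorn.

```python
# A minimal REST API for shop-floor telemetry, assuming FastAPI;
# endpoint paths and the payload shape are illustrative only.
from datetime import datetime
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Shop-floor telemetry API")

class Telemetry(BaseModel):
    machine_id: str
    metric: str          # e.g., "spindle_temperature_c"
    value: float
    recorded_at: datetime

STORE: list[Telemetry] = []   # in-memory stand-in for a real database

@app.post("/telemetry")
def ingest(sample: Telemetry):
    STORE.append(sample)       # a real service would forward this to storage/analytics
    return {"accepted": True}

@app.get("/machines/{machine_id}/telemetry")
def read(machine_id: str):
    return [s for s in STORE if s.machine_id == machine_id]
```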

2. Data management

Manufacturers scaling IIoT may struggle to handle data due to its volume and complexity, that is, huge data sets that arrive in their own native formats and flow from disparate sources.

To properly store and analyze this information, an IIoT platform needs a suitable data store. The system selected should be able to scale easily, manage time-series data, and support flexible schemas (the way information is organized within the database). Here are a few options mentioned by McKinsey and Amazon to choose from (a minimal write example for the time-series option follows the list):

  • NoSQL databases. While SQL databases store information based on well-defined schemas, their NoSQL counterparts organize data into more flexible structures. This enables NoSQL databases to store complex formats, such as unstructured data, making them ideal when scaling IIoT across multiple use cases, as pointed out by Microsoft.
  • Databases as a service (DBaaS). It is a licensing and delivery model in which a cloud provider offers highly scalable storage and computing resources on demand. Manufacturers prioritizing scalability for their IIoT solution may find DBaaS a very cost-efficient option.
  • Time-series databases. These are optimized to store and query time series data, which is a very common data type in manufacturing scenarios (for example, temperature trends in industrial machinery).
  • Data warehouses. A data warehouse is a type of enterprise system designed to unify the most critical information from multiple sources for data querying, consistency, and other purposes. This architecture involves ETL (extract, transform, and load) and data quality processes, and may become a key component in business intelligence.
  • Data lakes. Unlike data warehouses, these systems have no strict requirements in terms of data formats to ingest. They can store structured data for analytics, but can also be used as cheap repositories to keep unstructured information that might be useful in the future.
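
For instance, here is a minimal write for the time-series option, assuming the influxdb-client package for InfluxDB 2.x; the URL, token, org, bucket, and measurement names are placeholders.

```python
# A minimal time-series write, assuming the influxdb-client package
# (InfluxDB 2.x); connection details and names are placeholders.
from datetime import datetime, timezone

from influxdb_client import InfluxDBClient, Point, WritePrecision
from influxdb_client.client.write_api import SYNCHRONOUS

client = InfluxDBClient(url="http://localhost:8086", token="MY_TOKEN", org="factory")
write_api = client.write_api(write_options=SYNCHRONOUS)

point = (
    Point("machine_telemetry")                      # measurement
    .tag("machine_id", "press-17")                  # indexed metadata
    .field("spindle_temperature_c", 78.4)           # the actual reading
    .time(datetime.now(timezone.utc), WritePrecision.NS)
)
write_api.write(bucket="iiot", record=point)
```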

When it comes to data management, another issue to consider is the so-called data drift. This phenomenon entails a progressive change in the data collected by IIoT sensors due to firmware upgrades, device replacements, or feature modifications. The result can be a degradation in the accuracy of data analysis, as it's based on incoherent inputs.

A solution to data drift comes from machine learning. Scientists have already developed several ML algorithms to facilitate drift detection, and a variety of providers offer ML-powered tools to help in this regard, such as Azure Machine Learning.
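
As a simple illustration of one drift-detection approach (and not the Azure Machine Learning feature itself), the sketch below compares a baseline sample of sensor readings with a current sample using a two-sample Kolmogorov-Smirnov test; the data is synthetic.

```python
# Detecting a distribution shift in a sensor feature with a two-sample
# Kolmogorov-Smirnov test; the readings below are synthetic.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
baseline = rng.normal(loc=20.0, scale=0.5, size=1000)   # readings before a firmware upgrade
current = rng.normal(loc=20.6, scale=0.5, size=1000)    # readings after, slightly shifted

stat, p_value = ks_2samp(baseline, current)
if p_value < 0.01:
    print(f"Possible data drift detected (KS statistic={stat:.3f}, p={p_value:.2e}); "
          "retrain models or recalibrate the sensor.")
```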

An example of an IIoT architecture with multiple integrations needed (source: McKinsey)

3. Security

IIoT platforms can have several points of vulnerability due to the reliance on remote connections and the multitude of integrated devices. Furthermore, legacy systems are usually more prone to risk. To protect sensitive data and assets, a reliable IIoT solution will require solid cybersecurity and monitoring measures. Here are some recommendations from Amazon:

  • Establish centralized security policies and perform regular audits on all the IIoT components—including sensors, edge gateways, and networks—to identify vulnerabilities.
  • Divide your IoT network into smaller groups of components. This approach (known as microsegmentation) makes it possible to isolate workloads from each other, reducing the potential scope of a cyberattack.
  • Safeguard your IIoT platform with activity monitoring, risk analytics, security information and event management (SIEM), encrypted data exchange, identity and access management (IAM), as well as network segmentation through virtual private networks.
  • Ensure safer communication between the IIoT platform and edge devices through data exchange standards and protocols designed to strengthen security. These encompass the OPC Unified Architecture, along with CIP Security, Modbus Secure, and more.
  • Define a disaster recovery plan and update it based on security incidents. This may include reliance on cloud services to ensure data backup and business continuity.

4. Digital twin implementation

According to Microsoft’s survey, digital twins share several challenges with other IIoT use cases, including integration, skill gaps, data volume, and data quality. However, the adoption of a digital twin can also be hampered by other building and scaling IoT scaling problems due to its overall complexity.

The challenges of adopting a factory digital twin (source: Microsoft)

That’s because providing a realistic representation of a factory requires a constant stream of real-time data. This information should be collected with a sprawling network of sensors, covering every single manufacturing operation (in an ideal scenario). Furthermore, as you introduce new machinery to scale production, the IIoT network fueling the digital twin should scale, as well. Here’s what you can do to mitigate such issues:

  • When scaling your operations, consider creating multiple, interconnected digital twins (one for each business function, for example) instead of expanding the existing one. According to Dr. Emile Glorieux of the Manufacturing Technology Center, a modular system can help you to address potential failures, facilitate maintenance, and scale IIoT by adding new subparts.
  • McKinsey names edge computing as one of the most efficient approaches to managing large volumes of real-time IoT data. By minimizing the physical distance between a data source (e.g., machinery) and the computing power, latency is reduced, and the digital twin can portray production processes faster. On the other hand, maintaining or updating software on the edge can be quite hard and may require specific techniques, such as containerization.
  • Connectivity can also be improved by utilizing the 5G network, which provides high-speed, low-latency communication among multiple assets. Real-time process control is one of the manufacturing use cases where 5G is expected to have a huge impact.
  • A 2022 survey by PwC suggests that low-code platforms can act as accelerators for IIoT implementation and scaling. These development environments provide visual tools to facilitate coding, minimizing programming efforts. In this regard, Mendix reported that 41% of manufacturers surveyed in June 2022 want low-code systems to integrate with shop-floor devices and systems, while 39% aim to connect with legacy systems. 43% of the respondents expect low-code platforms to provide manufacturing-oriented app templates.

5. Business and organizational complexities

Manufacturers should consider IIoT implementation as a digital transformation impacting multiple business processes across their organizations. McKinsey highlights that you should be ready to handle these challenging implications:

  • Use case identification. Prioritize IIoT use cases that are likely to ensure the maximum ROI in the shortest period of time. For example, you may target essential business functions suffering from major inefficiencies. Or, start with noncritical processes presenting an opportunity of fast implementation, and then scale as these scenarios succeed. Besides, consider the expertise and resources required for such use cases, along with their scalability and replication potential across multiple locations to foster digital transformation with IoT.
  • Phased rollout. To mitigate IoT scaling problems, McKinsey recommends a phased deployment across multiple waves. The rollout should be carefully planned by setting up a suitable IIoT implementation roadmap to define use cases, timeline, and responsibilities. The first results of the pilot use cases can be observed within 6–8 weeks, making it possible to collect feedback from initial deployments and adjust the adoption plan.
  • Upskilling. This aspect of IIoT implementation and scaling should address the lack of expertise required to leverage the technology. The first way to fill skill gaps is through role transitions, facilitated by proper training initiatives. The second is via external partnerships, involving experts in relevant fields, such as data science.

Cross-team collaboration to streamline IoT scaling (source: McKinsey)

Allies, expertise, and a good plan

Fostered by a progressive reduction in sensor deployment and data storage costs, along with improved connectivity based on Bluetooth, Wi-Fi, and 5G, the Industrial IoT has made great strides in manufacturing and shows no signs of stopping.

Sometimes, however, “the enemy is within the gates,” as Cicero used to say. When it comes to deploying and scaling IoT, this can be a mix of nonstandardized protocols, integration issues, brand-new processes, as well as the lack of resources or expertise.

Still, you can overcome IIoT implementation challenges with a solid adoption strategy, POCs, KPIs, upskilling investments, and support from partners. This will help you to reap the benefits of this powerful technology.

The article was written by Andrea Di Stefano, Alex Khizhniak, and Sophia Turol.

The post Scaling Industrial IoT in Manufacturing: Challenges and Guidelines first appeared on Altoroslabs Technology Blog | Latest News in Custom Software Development.

]]>
https://www.altoroslabs.com/blog/scaling-industrial-iot-in-manufacturing-challenges-and-guidelines/feed/ 0
Outsourcing Software Development to Argentina or Brazil: What’s the Difference? https://www.altoroslabs.com/blog/outsourcing-software-development-to-argentina-or-brazil-whats-the-difference/?utm_source=rss&utm_medium=rss&utm_campaign=outsourcing-software-development-to-argentina-or-brazil-whats-the-difference https://www.altoroslabs.com/blog/outsourcing-software-development-to-argentina-or-brazil-whats-the-difference/#respond Wed, 22 Jun 2022 10:37:12 +0000 https://www.altoroslabs.com/blog/?p=41 Outsourcing to Argentina offers affordable access to software development talent regarded for delivering cutting-edge innovation.

The post Outsourcing Software Development to Argentina or Brazil: What’s the Difference? first appeared on Altoroslabs Technology Blog | Latest News in Custom Software Development.

]]>
Outsourcing to Argentina offers affordable access to software development talent regarded for delivering cutting-edge innovation.

Demand for software outsourcing

Many IT companies in major markets struggle to access the precise software development expertise required at a given time. Hiring and fully onboarding in-house talent typically involves significant costs and can take up to a year. Moreover, local employment legislation pressures these companies to retain talent on a long-term basis despite the lack of sufficient workloads. As such, outsourcing has become a cost-effective solution for IT companies seeking a range of software development talent. In this article, we examine Brazil and Argentina as prospective destinations that help companies meet the growing demand for software development outsourcing.

Software outsourcing to Argentina or Brazil

IT companies are recognizing the long-term value of building a startup in Latin America or utilizing the talent in the region. In particular, the two frontrunners you must consider when selecting your destination for software development outsourcing are Argentina and Brazil. These two countries have booming technology industries and are leaders in Latin America for software development.

Argentina is the region’s third largest economy. The country has highly skilled developers and a regulatory framework that supports software outsourcing. The country offers domestic technological services that have a high penetration rate in international companies. To that end, software development outsourcing is considered as one of the sectors that bring major trade opportunities to the country.

Brazil, on the other hand, is the largest economy in Latin America. According to the United Nations Conference on Trade and Development (UNCTAD), the country was the sixth largest destination for global Foreign Direct Investment (FDI) flows in 2019. In particular, software development is a booming market in Brazil, generating over $10 billion in 2021.

According to the World Bank, Brazil and Argentina are similarly situated in a global ranking of 190 countries for ease of doing business.

Country Rank out of 190 Doing Business (DB) Score
Argentina 126/190 59.0/100
Brazil 124/190 59.1/100

Note: The top country was New Zealand with a score of 86.8.

Argentina or Brazil: Which is better for software outsourcing?

Your company may have opportunities for software development and IT outsourcing to Brazil and Argentina. However, choosing which out of the two is more suitable for your company requires an understanding of some nuances. Choosing a software development provider in these regions requires consideration of four key factors:

  • a macro environment for software outsourcing
  • knowledge and competence of software developers
  • interpersonal and working relationship
  • software engineer salary

1. The macro environment

In Argentina, government support for the IT market has become very active. IT education was a major focus for the Federal Education Council in 2019, which has resulted in the production of highly qualified and globally competitive software developers.

The government has also commenced programs to improve the skill set of Argentinian software developers and make the country a regional leader. Key software engineer programs also exist such as Startup Buenos Aires, which can connect a startup with either local or international investors. IncuBAte is another project that helps strengthen and consolidate innovative enterprises in Buenos Aires.

Today, an estimated 200,000 software developers reside and work in Argentina. Approximately 10% of Argentinian exports now consist of tech services, half of which are exported to the US. With that, the number of unicorns calling Argentina home is now increasing. Some of the unicorns originally from this country include Globant, MercadoLibre, OLX, Auth0, Bitfarms, Uala, and TiendaNube.

In Brazil, tech companies also receive strong support from the government. A Brazilian law commonly known as Lei do Bem provides benefits such as reduced corporate income taxes and net income social contributions for businesses associated with technological innovation.

Today, Brazil is commonly known as a rising hub for custom software development, with over 500,000 developers in its workforce. With an association like Anprotec promoting the development of innovation ecosystems in the country, Brazil has approximately 369 business incubators and 100 technology park initiatives. As a result, it is home to most of the top 10 unicorns coming from Latin America, presenting strong potential to build a software company in Brazil.

Top 10 Latin American unicorns based on market value in 2022 (Image credit)

2. Competence of software developers

Will the local software engineer you hire be qualified? Yes, the chances of hiring a knowledgeable and competent software developer are high. Both Brazil and Argentina have a growing focus on IT education and training, which is contributing to the growing IT workforce in Latin America.

In terms of education, three of the top 10 universities in Latin America are located in Brazil and one is in Argentina. Based on another ranking of 1,750 world’s best universities for computer science courses, Brazil outperforms and outnumbers Argentina.

University Location Rank out of 1,750
University of Sao Paulo Sao Paulo, Brazil #135
State University of Campinas Campinas, Brazil #213
Federal University of Minas Gerais Belo Horizonte, Brazil #271
University of Buenos Aires Buenos Aires, Argentina #496

However, according to the report by Coursera Global Skills Index (GSI), Argentina outranks Brazil when it comes to technological competence. The report’s criteria for ranking technological capability between countries consisted of computer networking, operating systems, human computer interaction, databases, security engineering, and software engineering.

According to the same report, Argentina was number one in terms of technological skills and regarded for delivering cutting-edge innovation. Specifically, Argentina scored 100% for software engineering and 95% for operating systems. Brazil, however, only ranked 30 and was classified as delivering “competitive” rather than “cutting-edge” innovation.

Argentina tops the 2020 Coursera GSI global ranking for technology (Image credit)

3. Working relationships with software developers

How easy or difficult would it be to collaborate with Argentinian and Brazilian developers? Both countries are regarded for having strong English-speaking skills, which facilitate team communications.

However, Argentina is the top-ranked Latin American country for English proficiency. Indeed, it is the only Latin American country classified as having High Proficiency. To compare, Argentina was ranked #30 with a score of 556 (High Proficiency), while Brazil was ranked #60 with a score of 497 (Low Proficiency).

Argentina ranked #30, while Brazil ranked #60 for English proficiency (Image credit)

The time zones in both countries are also favorable to western companies, presenting nearshore opportunities for tech startups based in the US. To note, New York’s time zone is GMT-4, and California’s time zone is GMT-7. Brazil’s time zones are GMT-2, GMT-3, GMT-4, and GMT-5, with Sao Paulo being an hour ahead of New York and four hours ahead of California. Argentina’s time zone is GMT-3, similarly having an hour time difference with New York and a four-hour time difference with California.

Having Argentinian software developers who are proficient in English proves to be very practical in establishing and maintaining clear communication practices.

4. Software developer salary

In the US, the average income of a software development engineer is rising. A typical tech company might pay an average annual developer salary of $133,000 for back-end, $120,000 for full-stack, and $120,000 for mobile expertise. These figures are based on a survey by StackOverflow involving over 80,000 developers.

Average salaries of software developers in the US (Image credit)

Hence, the biggest question of all is, how much savings could companies in the US generate by outsourcing software development to Brazil or Argentina? The answer is—a significant amount. The annual salary of Argentinian and Brazilian software engineers, as provided by Glassdoor, could be less than half of their US-based counterparts.

In particular, software engineer salaries are lower in Argentina than in Brazil, offering custom software companies a bigger opportunity to cut costs without compromising the quality of the work delivered.

Average software developer salary per year (in US dollars)

Role                        New York, USA   California, USA   Brazil    Argentina
Senior software developer   $112,677        $118,003          $64,680   $56,537
Software developer          $106,333        $113,363          $40,296   $36,505

Key comparisons and conclusion

In summary, the two countries can be compared as follows in the table below.

Factor Brazil Argentina
Approximate number of software developers 500,000 200,000
Number of top ranking schools for computer science 3 1
English proficiency ranking 60/112 30/112
Time zones GMT-2, GMT-3, GMT-4 and GMT-5 GMT-3
Ease of doing business 59.1/100 59.0/100
Average software engineer salary $40,296 per year $36,505 per year

Brazil has a more established IT infrastructure, as evidenced by the fact that most of the top Latin American tech unicorns are based there and that more of its top-ranking universities produce globally qualified software developers. The talent and skill level of Brazilian software developers is undeniable, which also helps to explain the country's higher average software developer salary.

On the other hand, Argentina is globally recognized for being the top country in terms of delivering innovative, cutting-edge software engineering services. It also has a much higher English proficiency ranking and a more affordable software engineer salary than Brazil. Therefore, it presents a strong alternative for software development outsourcing.

Outsourcing opportunities in Argentina

The volume of completed projects by Altoros offers evidence of the robust software development outsourcing industry in Argentina. We have 50 employees and two well-established delivery centers in Buenos Aires and Santa Fe, which, to date, have completed 196 projects.

The company has well-selected talent in place: bilingual software engineers with a B2 (upper-intermediate) English level or higher and an average of 6.5 years of experience. 92% of our engineers also hold a bachelor's, master's, or doctorate degree in either computer science or math.

With an in-house talent pool consisting of 56% senior-level and 44% mid-level software developers, Altoros has been recognized by Clutch as one of the leading software development companies in Latin America.

We have complete confidence in the experience and competence of our Argentinian software development team. As such, we offer a free two-week, no-risk trial period for newly onboarded tech companies.
Visit our website to learn more about our Argentinian delivery centers.

The post Outsourcing Software Development to Argentina or Brazil: What’s the Difference? first appeared on Altoroslabs Technology Blog | Latest News in Custom Software Development.

]]>
https://www.altoroslabs.com/blog/outsourcing-software-development-to-argentina-or-brazil-whats-the-difference/feed/ 0