Can your cabling support the demands of the future?

Are you equipped to deliver the healthcare of the future? In the first of our five-part blog series, we explore key areas of consideration to help you make the decisions that will improve the lives of patients, doctors, and nurses.

Technical innovations are driving faster, more accurate diagnoses, streamlined care and better outcomes for patients. By 2021, the health technology sector is expected to reach $280 billion, according to the 2019 US and Global Health Care Industry Outlook report by Deloitte*.

What’s more, Deloitte suggests the US healthcare industry is moving towards a model based on value rather than volume. This means keeping people healthy and out of the hospital will be key. Rather than seeing people as patients, healthcare providers should treat them more like members – a shift that could result in greater customer loyalty. The successful deployment and management of wireless technology can ease this transition by providing a reliable, always-on network, which is critical to the success of future digital tools, workflows and patient care.

But wireless data transfer is only as reliable and fast as the infrastructure that supports it. Data has to be funneled through a cable at some point, and you may find that your existing cabling infrastructure can’t keep up with the demands of the modern healthcare organization.

Too often, cabling is neglected when planning for new technology investments. The reality is that robust cabling is essential for the success of wireless technologies. It provides the reliability and performance that always-on healthcare networks demand. A 10G infrastructure provides the bandwidth to support the most demanding technology, delivering high-resolution imaging across a hospital in moments, while keeping patients and staff wirelessly connected, and medical records secure.

Not investing in physical infrastructure may mean not getting the best from the wireless technologies that help deliver competitive patient care. Here are four use cases.

Fast and efficient data collection

Wirelessly connecting medical devices to Electronic Health Records systems has reduced the time it takes to enter vitals from 7-10 minutes to less than 1 minute per patient, according to Becker’s Hospital Review**. What’s more, having access to up-to-date test results and medical records electronically enables staff to provide more streamlined care.

Reducing errors

With the help of wireless technology, patient information no longer has to be interpreted and uploaded to a hospital database manually, significantly reducing the risk of errors.

Location tracking

Wireless technology offers the ability to track a patient’s location, providing a sense of freedom and security for those with long-term illness living outside a medical facility. For example, if a patient with Alzheimer’s disease goes missing, they can be easily located.

It also provides better care in medical facilities. Wireless, wearable sensors track patient movement, alerting nursing staff when someone leaves their room or suffers a fall.

Remote monitoring

Wireless smart devices allow doctors to monitor patients remotely. Medical devices such as vital sign monitors and infusion pumps transmit data to electronic records, giving doctors remote access to critical information. What’s more, doctors can provide patients with advice via video conferencing.

These examples simply scratch the surface of what’s possible with wireless technology powered by high-performance cabling. Our solutions can serve as the backbone for platforms that improve the quality of care today and beyond.

Discover how Elmhurst Memorial Healthcare relied on Panduit’s Enterprise and Data Center Solutions to create a home for high-level medical services to grow and thrive. Learn more now.

*  https://www2.deloitte.com/us/en/pages/life-sciences-and-health-care/articles/us-and-global-health-care-industry-trends-outlook.html

**  https://www.beckershospitalreview.com/healthcare-information-technology/the-connected-hospital-wireless-technology-shapes-the-future-of-healthcare.html

Can your infrastructure meet the requirements of MiFID II?

With GDPR still a prevalent concern across the financial services industry, financial institutions face another major regulatory challenge in the form of the Markets in Financial Instruments Directive II (MiFID II). In the UK alone, the Financial Conduct Authority received 1,335 notifications of inaccurate transaction reporting under MiFID II during 2018*.

The directive is multi-faceted. Ostensibly, the EU designed it to offer more protection to investors by introducing greater transparency to asset classes, whether they’re equities, fixed income, exchange traded funds or foreign exchange.

But this has consequences for your underlying networking infrastructure, which must support higher transaction volumes with tighter timing requirements. This is especially pertinent for trading firms in the High Frequency Trading (HFT) sector, where trimming network latency by nanoseconds results in increased profits and competitive advantage.

With this in mind, MiFID II mandates latency standards across global banking networks. It also requires communication across those networks to be captured and recorded in real-time, and time-stamped accordingly.

Time stamping is a critical factor and must be handled correctly. Uniform latency across a network helps create a consolidated view of network transactions, all carrying accurate time-stamps.

There are certain technical standards for time-stamping that firms must meet under the new directive. Among these are: choosing the clock that you will use as a reference; indicating the type of organizations involved in a trade; defining the type of trade; and setting the level of time-stamp granularity, e.g. microseconds or nanoseconds. If you, as a trader, are dealing with a dual-listed, cross-border stock that spans two time zones, your infrastructure needs to be sufficiently uniform that you can document transactions consistently and timestamp them accurately. Once again, latency is the key.
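To make the granularity point concrete, here is a minimal, purely illustrative sketch of producing a UTC timestamp at a chosen granularity. The function name and structure are our own invention, not MiFID II compliance code: in production, clocks are synchronized to UTC via mechanisms such as PTP or GPS, and the required granularity depends on the firm’s trading activity.

```python
from datetime import datetime, timezone

def utc_timestamp(granularity="microseconds"):
    """Return an ISO 8601 UTC timestamp at the requested granularity.

    Illustrative only: real trade-reporting systems derive time from a
    traceable UTC reference (e.g. PTP/GPS), not the host system clock.
    """
    now = datetime.now(timezone.utc)
    if granularity == "milliseconds":
        return now.isoformat(timespec="milliseconds")
    # Python's datetime resolves to microseconds; nanosecond granularity
    # would require a different time source (e.g. time.time_ns()).
    return now.isoformat(timespec="microseconds")
```

The point of the sketch is that granularity is a recording decision made up front: if your network cannot deliver uniform, predictable latency, the extra digits in the timestamp carry no real meaning.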

The consequences are even fiercer than with GDPR, as non-compliant companies risk fines of up to €5m, or up to 10% of global turnover**. This is a concern for the 65% of capital market firms across Europe that stated in a 2018 survey that they had no adequate or systematic method in place to monitor trades in accordance with best execution criteria***.

Read this blog to find out how else you should be equipping your network infrastructure to ensure efficiency.  

*  https://www.ftadviser.com/regulation/2019/04/10/more-than-1-000-mifid-ii-breaches-reported-to-fca/

**  https://www.pwc.ch/en/publications/2018/solgari-industry-report.pdf 

***    https://www.finextra.com/blogposting/16488/mifid-ii—one-year-on

Latency is only the start of the challenge

There’s a clear need for a latency standard that can be applied globally across financial institutions. But that’s just one step. The real challenge emerges when you ask why this standard is necessary, and what it means for the future success of your business.

Latency is key to your success because if it isn’t perfectly calibrated, it’ll cost you. According to a study by the Tabb Group, an electronic trading platform that lags the competition by as little as 5ms could lose as much as $4m in revenue per millisecond.*

The reality is that the demand on your digital infrastructure has never been higher. We live in a world of high-speed financial trading. Data needs to be processed, analyzed, and transmitted at lightning speeds to meet the global, mobile, and 24/7 demands for instantaneous transactions and transfers.

Moreover, when positions change in an instant, latency isn’t just a matter of efficiency. It’s a matter of profitability. Which means that your infrastructure must be up to the task if your institution is to remain viable over the coming years.

That’s why it’s vital to have a next-gen digital infrastructure architecture that’s robust and reliable. Joe Skorupa, VP Distinguished Analyst at Gartner Data Centre Convergence, recently commented**, “I have known major financial organizations make multi-million dollar investments only to rip-and-replace them the very next day if a technology comes along that improves their competitive edge.

However, the network hasn’t really changed in the last few decades because network folk are conservative. The reasons are quite clear: if a server in a data center fails, your application goes down; but if your network goes down your entire data center goes down.”

Skorupa highlights the latency issue right here. In order to benefit from super-speed transactions, and make the most of your digital transformation, you need to equalize latency across your entire network. This involves taking an in-depth look at your existing physical infrastructure, and determining where change is required.

Upgrading and consolidating your data center infrastructure can also help to mitigate risk, and future-proof the business, as this blog post explains [http://panduitblog.com/2019/04/29/datacenter/consolidation-the-pros-and-cons-of-putting-your-eggs-in-one-basket/].

As a trusted infrastructure partner, Panduit can help you tackle your latency issues, and ensure the right networking technologies are underpinning your financial services.

##

*Source: The Value of a Millisecond: Finding the Optimal Speed of a Trading Infrastructure, April 2008

**Source: https://datacentrenews.eu/story/opinion-automating-the-data-center-with-ibn, October 2018