SDChain Alliance, which describes itself as the world’s first public blockchain based on an international standard and designed to ensure the reliability of IoT data, has signed a Memorandum of Understanding (MoU) with Guangzhou Robustel Co, an industrial IoT/M2M hardware and solutions provider.

SDChain Alliance aims to make full use of blockchain for IoT applications. It envisions IoT data from the physical world being shared over a fast, cost-effective blockchain network in which data producers and data users exchange digital assets within an open partnership ecosystem, based on the globally standardised IoT six-domain model.

The full realisation of IoT’s potential depends largely on interoperability among IoT systems. According to a McKinsey study, by 2025 the number of devices connected to the Internet will reach 25 billion, and the output value will reach USD 6 trillion, with the associated economy going up to USD 36 trillion. Research and Advisory firm ISG also believes that the benefits of IoT are undeniable. It can bring much of the physical world – from industrial assets to commonplace devices to people – into a connected ecosystem, resulting in better business outcomes.

Viewed against this background, the emergence of the SDChain Alliance assumes greater significance. “The distributed network architecture of blockchain technology provides a mechanism to maintain consensus among IoT devices, without the need for verification with a central authority, to address security threats and protect user privacy. Blockchain technology can provide point-to-point direct interconnection for data transfer, greatly reducing the cost of computing and storage,” says Dr. Shen, chief scientist of SDChain.
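To make the idea concrete, the following is a minimal, illustrative sketch of a hash-linked ledger in Python: each record commits to the hash of the previous one, so any peer can verify the history without trusting a central server. It is not SDChain’s implementation and deliberately omits consensus, networking, and signatures.

```python
import hashlib
import json
import time

def make_block(prev_hash, payload):
    # Each block commits to the previous block's hash and a payload,
    # e.g. an IoT sensor reading.
    block = {"timestamp": time.time(), "prev_hash": prev_hash, "payload": payload}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

def verify_chain(chain):
    # Any peer can recompute the hashes; tampering with a payload breaks the links.
    for prev, curr in zip(chain, chain[1:]):
        body = {k: v for k, v in curr.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if curr["prev_hash"] != prev["hash"] or curr["hash"] != recomputed:
            return False
    return True

genesis = make_block("0" * 64, {"device": "sensor-1", "temp_c": 21.4})
chain = [genesis, make_block(genesis["hash"], {"device": "sensor-1", "temp_c": 21.7})]
print(verify_chain(chain))  # True until any block is altered
```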

Commenting on the MoU, Mr David Pan, CEO of SDChain Alliance, observes, “SDChain Alliance is proud to be strategically aligned with Robustel, which possesses strong technical strength and rich market experience. Through leveraging the value of the blockchain, we will explore the potential of innovative applications with IoT and blockchain technologies together.”

The post SDChain Alliance partners with Robustel appeared first on Network Communications News (NCN).

As part of its plan to start limited deployments in dense urban areas, Vodafone announces that it will begin testing next-generation 5G mobile networks in seven of Britain’s largest cities.

With testing due to start in the latter part of 2018, the management of global operator Vodafone Group has signalled it is in no rush to roll out 5G, seeing the potential tenfold increase in mobile broadband speeds it offers initially as a way to wring greater efficiencies out of network choke points.

As the UK’s third largest mobile operator, Vodafone UK says it will lay the groundwork for 5G services as a supplement to its existing 4G networks – these will be installed at 40 locations surrounding Birmingham, Bristol, Cardiff, Glasgow, Liverpool, London and Manchester.

Vodafone, in a bid to target highly trafficked urban areas such as sports venues, offices, factories and hospitals, says its latest trials of 5G will commence between October and December 2018.

This may seem risky to some considering rival mobile operator EE, a unit of BT Group, has just released its plans to get the ball rolling by switching on the UK’s first 5G trial network in East London in October.

Explaining why it is in less of a hurry to switch on, Vodafone says its broad programme of 4G network upgrades undertaken across Europe in recent years – combined with its strong position in licensed radio airwaves – ensures that it has the capacity to meet traffic demand for years to come.

The company says it is waiting until 5G-ready phones and other devices start to become available in the 2020 timeframe before it considers fuller 5G deployments. Vodafone has forecast that 5G is unlikely to be adopted by 50% of phone users before the middle of the next decade, or around seven years out. But, all things considered, could this be a risky option?

The post Vodafone sets 5G trials but is in no rush to roll it out appeared first on Network Communications News (NCN).

ExtraHop has announced Reveal(x), to be launched in summer 2018, which the company says sets a new bar for network traffic analytics at enterprise scale. The latest release includes new capabilities designed to modernise enterprise security operations with critical asset behaviour analysis that instantly surfaces the highest-risk threats, even those hiding within encrypted traffic. With this high-fidelity insight, ExtraHop claims security operations teams can zero in on critical threat patterns and investigate down to the root cause in seconds, not days.

Between 2017 and 2018, threat dwell time in the enterprise increased to 101 days, according to FireEye’s M-Trends 2018 Report. The Verizon Data Breach Investigations Report noted, ‘in many cases, it’s not even the organisation itself that spots the breach — it’s often a third party, like law enforcement or a partner. Worst of all, many breaches are spotted by customers.’

The company says its Reveal(x) release will significantly reduce dwell time by highlighting late stage attack activities, shining light on the ‘darkspace’ in the enterprise – the hard-to-reach areas of the network along the East-West corridor. Through comprehensive network traffic analytics, Reveal(x) is set to deliver real-time visibility and high-fidelity insight into threats to your critical assets throughout the hybrid enterprise. The new ‘headlines’ dashboard prioritises speed and accuracy, eliminating the fake news fire drills from other tools by highlighting the highest-risk detections correlated with external and industry threat intelligence. The company has announced that other key new features in the summer 2018 release include:

  • TLS 1.3 support: As of 2017, 41% of cyber-attacks used encryption to evade detection, so the ability to detect threats within encrypted traffic is more critical than ever. With the latest release, Reveal(x) claims to be the only solution offering out-of-band decryption at up to 100 Gbps, with support for the new TLS 1.3 protocol as well as decryption of traffic protected by perfect forward secrecy.
  • Need-to-know decryption: Respect for privacy is simple now that authorised threat hunters and forensic investigators can be given rights to look inside suspicious packets for authoritative evidence (including content and user information), while other analysts only see the detections and metadata insights gleaned from the decrypted traffic.
  • Network privilege escalation detection: Reveal(x) is said to identify changes to behaviour that indicate an attacker has compromised a device, escalated access rights, and is using these higher privileges to explore and attack within the enterprise. The company say that Reveal(x) now infers escalation attempts on critical assets automatically based on changes in device behaviour, commands, and protocol use, enabling detection of attacks underway and allowing SecOps teams to contain them before damage is done.
  • Peer group anomaly detection: Reveal(x) claims to automatically correlate device behaviour against peer devices for a more precise assessment of anomalous behaviour, leveraging auto-discovery and classification of critical assets. This strong outlier validation, the company says, improves insider threat and compromised host detection and enriches Reveal(x)’s investigative workflows with critical asset context that helps SecOps collaborate (a minimal scoring sketch follows this list).
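The peer-group idea above can be illustrated in a few lines of Python: score one device’s metric against the distribution of the same metric across its peer group and flag strong outliers. This is only a conceptual sketch with hypothetical numbers, not ExtraHop’s model.

```python
from statistics import mean, stdev

def peer_group_zscore(device_value, peer_values):
    # Compare a device metric (e.g. connections to new external hosts per hour)
    # against devices in the same auto-discovered peer group.
    if len(peer_values) < 2:
        return 0.0                       # not enough peers to judge
    mu, sigma = mean(peer_values), stdev(peer_values)
    if sigma == 0:
        return float("inf") if device_value != mu else 0.0
    return (device_value - mu) / sigma

peers = [120, 95, 130, 110, 105]         # hypothetical hourly counts for peer devices
suspect = 940
if peer_group_zscore(suspect, peers) > 3:
    print("flag device: behaviour diverges sharply from its peer group")
```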

“Today’s threat actors are taking advantage of vast attack surfaces that extend across every endpoint from the branch office to the datacentre or the cloud and too often they operate unnoticed,” says Jesse Rothstein, CTO and co-founder, ExtraHop. “At ExtraHop we’ve spent years developing technology that can analyse the entire network in real time – every critical asset and every transaction so that there are no blind spots. With Reveal(x) Summer 2018, we’ve applied that deep domain expertise to security operations, closing the visibility gap and surfacing the accurate, targeted information that allows SecOps teams to act quickly and with confidence.”

The post ExtraHop Reveal(x) sheds light on the darkspace appeared first on Network Communications News (NCN).

John Finch at RingCentral analyses new Altimeter report findings, discussing how digital transformation is making contact centres unified, collaborative, and intelligent.

Traditional contact centres as we know them were designed and deployed at scale in the 1960s, and many remain built around management and governance models, and standards, that are no longer applicable.

Today’s customer expects to communicate in whatever way they find most convenient, including phone, web, mobile, text, and social media. They demand higher levels of personalisation, immediacy, and convenience. But on-premises technologies are, by design, not unified, omnichannel, or real time, and as a result fall short in an always-on, hyperconnected world.

A recent report from Brian Solis, commissioned by RingCentral – Contact Centre 2.0: The Rise of Collaborative Contact Centres – demonstrates the need for digital transformation of the established contact centre model into a truly customer-centric, modern approach.

Based on a survey of 500 knowledge workers in the US and UK, the findings outline the decline of traditional contact centres built on on-premises technologies, siloed working, and limited, dated customer channels. In their place, new cloud models enhance collaboration between experts and contact centre agents, delivering first-contact resolution for today’s customer. Companies are adopting cloud-first contact centre strategies in order to be more competitive in the market:

  • 86% aim to complete a transition to Contact Centre 2.0 within three years.
  • 61% of companies have or plan to fully transition to the cloud.

Meeting the needs of today’s customer is more critical than ever. It’s not just about delivering exceptional experiences to establish a competitive advantage; it’s about providing unique and fast resolution to the modern customer in order to meet evolving standards of service excellence. The technology is here today with the cloud, and it is creating unprecedented opportunities for customer experience innovation.

The post The rise of the collaborative contact centre appeared first on Network Communications News (NCN).

The fax machine – a piece of technology whose heritage traces back well over 150 years, to 1842 – sits in the corner of many offices and is rarely given a second thought; until, of course, it makes an alarming noise because someone has accidentally dialled the fax line instead of the phone. Though our relationship with these machines today may be unapologetically indifferent, there is more to the fax than meets the eye.

As BT, like telco operators across Europe, continues to implement its plans to switch off the analogue PSTN/ISDN telephone networks, it is actually the large number of fax users, rather than voice customers, who are likely to feel the most impact. While awareness of the switch-off was pegged at a lowly 25% of UK businesses in a TalkTalk Business survey last year, what discussion there has been has often focused on the migration to Voice over IP. In reality, it is arguably the move away from the traditional fax machine, or from multifunction devices equipped with fax cards, that presents the prime opportunity or threat for many organisations.

“Any organisation still using fax extensively in 2018 probably has it deeply ingrained in its communications culture and business processes,” says Stéphane Vidal, vice president of marketing and communications at XMedius. “Until now such entities have had two options ahead of the BT analogue switch-off scheduled for 2025. The first and simplest was to move, sooner rather than later, to a Fax over IP solution, which offers significant and immediate cost and efficiency savings. The alternative was to invest in reengineering the culture and redesigning the processes to incorporate new ways of working and communicating. Now customers can implement both options in parallel.”

He continues, “The NHS, in particular, along with other public-sector organisations and industries such as banking, insurance and legal, are still heavy users of fax. Indeed, it has been widely reported, following research released from DeepMind Health, that the NHS is the largest buyer of traditional fax machines in the world. Fax has maintained a stronghold in these areas as a result of familiarity, convenience, ubiquity and the legal standing of the audit trail it provides. While seven years may seem a long time, BT will stop selling new lines in 2020, so any organisation relying on fax should be planning its next move now.”

Independent industry analyst Rob Bamforth also identifies fax as a potentially forgotten player in the transition, and notes that the deadlines demand attention: “As the analogue telco network switchover looms, it would be wise to take another look at the fax sooner rather than later. Identify what use cases are required. Then consider a move to virtualise fax and shift from fax hardware and phone lines to fax software, networks and cloud services.”

Whichever route is chosen, multiple IP-based solutions are available that combine the ease of use and familiarity of fax, with best practice security and data protection principles.

Stéphane acknowledges that by moving quickly to a Fax over IP (FoIP) solution such as XMediusFAX, users can maintain established working practices, become more environmentally sustainable, comply with ever-increasing security requirements and also benefit from a rapid financial return through savings on telco charges, paper and toner.

The post The fax, a technology that never really went away appeared first on Network Communications News (NCN).

Rubrik has announced the release of Alta 4.2, the latest advancement in its cloud data management platform. With Alta 4.2, Rubrik says it is reinforcing its position as a single control plane for the enterprise hybrid cloud, with further support for private cloud, public cloud, and traditional data centre workloads. The company says users now have the ability to protect a broad set of cloud-native AWS EC2 workloads and traditional enterprise platforms like AIX and Solaris with the same data management software, helping organisations bridge the gap between existing and future infrastructures.

Rubrik identifies that most enterprises are interested in developing a hybrid cloud computing strategy to exploit the benefits of public cloud infrastructure as a service/platform as a service (IaaS/PaaS) for better elasticity, agility and overall operational efficiency. And, according to polling results from Gartner’s data centre infrastructure & operations management summit, 32% of IT leaders reported that developing a private/public/hybrid cloud strategy is the largest data centre (DC) challenge they face. Therefore, enterprises adopting the hybrid cloud look for flexibility to easily migrate workloads across environments based upon economic or business requirements.

“There is no question that hybrid cloud is the future, and adoption will be driven by both present workloads and future architecture,” says Arvind Nithrakashyap, chief technology officer at Rubrik. “We built Alta 4.2 to enable every aspect of an enterprise hybrid cloud strategy. Our goal is to make it simpler for every organisation to adopt the public cloud and to build an AWS-like, self-service cloud offering of their own. And we ensure everything is integrated in one, full-featured solution, so your infrastructure team has the same Rubrik experience whether you’re working with proprietary UNIX workloads or cloud-native applications.”

Cloud-native data protection through a single control plane

As enterprises deploy new, cloud-native applications for AI, real-time analytics, IoT, and other use cases, Rubrik claims to offer Amazon Web Services EC2-native backup and lifecycle management to protect those applications in the cloud. The company says that whether you deploy Rubrik in a data centre or on AWS, Rubrik Cloud Data Management can now index, catalogue, and protect any application running on Amazon EC2. Using AWS’s API-based functionality, Rubrik simplifies the backup and recovery process, making it possible to run everything on the cloud – production applications, source data, backup and recovery software, and archival storage.
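For readers unfamiliar with what API-driven EC2 protection looks like, here is a generic sketch using the AWS SDK for Python (boto3): it creates an image (AMI) of a running instance and tags it for lifecycle management. The instance ID, region and tag are placeholders, and this only illustrates the underlying AWS calls – it is not Rubrik’s integration.

```python
import datetime

import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

instance_id = "i-0123456789abcdef0"      # placeholder instance
stamp = datetime.datetime.utcnow().strftime("%Y%m%d-%H%M%S")

# Create a point-in-time image of the instance without rebooting it.
image = ec2.create_image(
    InstanceId=instance_id,
    Name=f"backup-{instance_id}-{stamp}",
    NoReboot=True,
)

# Tag the image so a lifecycle policy (or a later cleanup job) can find it.
ec2.create_tags(
    Resources=[image["ImageId"]],
    Tags=[{"Key": "retention-days", "Value": "30"}],
)
print("created backup image", image["ImageId"])
```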

Extending deeper into the enterprise data centre

The company recognises that the hybrid cloud also promises to bridge the gap between long-lived traditional data centre applications and the new infrastructure of the future. To this point, the company says that Rubrik Alta 4.2 protects both IBM AIX and Oracle Solaris within the core product, offering enterprise operations teams the same suite of data management tools that they use for other physical, virtualised, and cloud-based workloads. Rubrik adds that organisations can begin moving some of these workloads into the cloud in a matter of hours, and that for larger enterprises that have adopted Rubrik, support for proprietary operating systems will make it easier to standardise on the Rubrik platform.

“We’re advocates of the modern Rubrik approach across our physical and virtual environments,” says Kevin Mortimer, infrastructure services manager at the University of Reading. “With Alta 4.2, we’re excited to see the platform extend even deeper into the data centre, further revolutionising our approach to backing up our data.”

The post Bridging the gap with Rubrik 4.2 appeared first on Network Communications News (NCN).

BeyondTrust, a cybersecurity company dedicated to preventing privilege misuse and stopping unauthorised access, has announced the results of the 2018 Implications of Using Privileged Access Management to Enable Next-Generation Technology Survey.

The company notes that the survey shows that 90% of enterprises are engaged with at least one next-generation technology (NGT), such as cloud, IoT, or AI. Yet, while enterprises are optimistic about the business benefits these technologies can bring, they also have concerns about the risks, with 78% citing the security risks of NGTs as somewhat to extremely large. One in five respondents experienced five or more breaches related to NGTs. Excessive user privileges were implicated in 52% of breaches.

Exciting times ahead

Now is the time to be involved in IT, says BeyondTrust: next-generation, transformative technologies such as AI/machine learning and IoT, and business processes like DevOps, are leading the way to a bright future full of operational efficiencies, greater business agility, and cost savings. Yet, the company acknowledges, there is also a dark side to these NGTs: security vulnerabilities.

To better understand how security issues, such as privileged access management (PAM), affect the adoption of NGTs, BeyondTrust surveyed 612 IT professionals in 13 countries. The results are a wake-up call for anyone looking to leverage these NGTs.

DevOps has reached the mainstream; AI and IoT are not far behind

The survey found broad interest in NGTs, the most common being Digital Transformation (DX), DevOps and IoT. IT professionals report that these NGTs are important for their organisations, with 63% saying Digital Transformation (DX) will have a somewhat to extremely large impact on their organisation, followed by DevOps (50%), AI (42%), and IoT (40%).

Significant movement toward the cloud 

The survey also found that cloud transformation is accelerating.  Respondents indicate that – today – 62% of workloads are on-premises, with 15% in a public cloud, 11% in private clouds, and 8% in SaaS applications. Over the next three years, that is projected to dramatically change: on-premises drops to 44%, public cloud jumps to 26%, private cloud increases to 15%, and SaaS increases to 12%.

One in five respondents experienced five or more breaches related to NGTs

Security issues resulting from NGTs happen at an alarming rate: 18% of respondents indicated they had suffered an NGT-related breach in the last 24 months that resulted in data loss, 20% experienced a breach that resulted in an outage, and 25% saw breaches over that period that triggered a compliance event. One in five survey respondents experienced five or more breaches.

Too much privilege results in breaches 

The company says the study shows that more than half the time, these breaches occur because trusted users do inappropriate things for innocent reasons, with 13% of respondents indicating it happens ‘often’ or ‘all the time.’ In 18% of cases it is trusted insiders going rogue, and in 15% of cases it is outsiders gaining privileged access to steal credentials. In each case, excessive privileges are to blame.

There are real business costs that result from breaches. BeyondTrust say the top costs are: decreased productivity, loss of reputation, monetary damages, and compliance penalties.

Privileged access management can facilitate the move to NGTs

Respondents overwhelmingly indicate that PAM-related capabilities can improve security and facilitate a move to NGTs.  Top practices include controlling and governing privileged and other shared accounts (60%, 59%, respectively), enforcing appropriate credential usage (59%), and creating and enforcing rigorous password policies (55%). In fact, 100% of the survey respondents say they are employing at least one PAM-related best practice to avoid NGT problems with privileged access.

How privileged access management can enable the transformation to next-generation technologies

To improve security while reaping the transformative benefits that NGTs offer, organisations should implement five privileged access management (PAM) best practices that address use cases from on-premises to cloud.

  1. Discover and inventory all privileged accounts and assets. Organisations should perform continuous discovery and inventory of everything from privileged accounts to container instances and libraries across physical, virtual, and cloud environments.
  2. Scan for vulnerabilities and configuration compliance. For DevOps and cloud use cases, organisations should scan both online and offline container instances and libraries for image integrity.
  3. Manage shared secrets and hard-coded passwords. Governing and controlling shared and other privileged accounts represents one of the most important tactics organisations can employ to limit the effects of data breaches resulting from NGTs (a minimal scanning sketch follows this list).
  4. Enforce least privilege and appropriate credential usage. Organisations should only grant required permissions to appropriate build machines and images through least privilege enforcement.
  5. Segment networks. Especially important in DevOps, lateral movement protection should be zone-based and needs to cover movement between development, QA, and production systems.
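As a concrete starting point for practice 3, the sketch below scans a directory tree for likely hard-coded credentials so they can be moved into a managed secrets store. The patterns are illustrative only – a production ruleset would be far broader – and the script is not tied to any BeyondTrust product.

```python
import re
import sys
from pathlib import Path

# Illustrative patterns for likely hard-coded credentials; not exhaustive.
PATTERNS = [
    re.compile(r"(password|passwd|pwd)\s*[:=]\s*['\"][^'\"]+['\"]", re.IGNORECASE),
    re.compile(r"(api[_-]?key|secret|token)\s*[:=]\s*['\"][^'\"]+['\"]", re.IGNORECASE),
]

def scan(root):
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for lineno, line in enumerate(text.splitlines(), 1):
            if any(p.search(line) for p in PATTERNS):
                print(f"{path}:{lineno}: possible hard-coded secret")

if __name__ == "__main__":
    scan(sys.argv[1] if len(sys.argv) > 1 else ".")
```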

“It is encouraging to see that organisations understand the benefits that Privileged Access Management can deliver in protecting next-generation technologies, but there are more best practices to employ,” says Morey Haber, chief technology officer at BeyondTrust. “The survey affirms that security should be at the forefront of new technology initiatives, otherwise, organisations can experience serious financial, compliance, and technological ramifications later on.”

The post Five pillars to secure DevOps, cloud, and IoT adoption appeared first on Network Communications News (NCN).

Allot Communications Ltd, a provider of network intelligence and security solutions for service providers worldwide, releases findings from its Telco Security Trends Report. The company says the findings reveal a dynamic and automated threat landscape in which consumers lack the security expertise to effectively protect themselves.

The findings suggest that mobile and the Internet of Things (IoT) continue to be primary attack vectors, contributing to a spike in cryptojacking, adware, and distributed denial of service (DDoS) attacks.

The company say its Telco Security Trends Report is based on anonymous data gathered from four communications service providers (CSPs) across Europe and Israel – who, between them, protect seven million customers. It found that during the period from November 2017 to February 2018, nearly two billion mobile security threats were blocked – an average of two each day per mobile device.

Of those blocked threats:

  • Almost one billion were triggered by crypto mining malware, the leading security threat, corresponding to the rise in cryptocurrency valuation in late 2017/early 2018.
  • Over one hundred million threats were triggered by adware only.
  • Forty thousand threats were triggered by direct attacks in the form of ransomware and banking trojans.

The escalating IoT threat landscape

As part of this study, Allot says it set up honeypots simulating consumer IoT devices and exposed them to the internet. The results showed immediate, successful attacks, peaking at a rate of over one thousand per hour, with the findings revealing that a device can become infected within 42.5 seconds of being connected to the internet. There was also an increase in the number of unique IP addresses attacking the honeypots over time, from 44 per day to a peak of 155 per day in less than a month of exposure.
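To show the measurement idea in miniature, here is a hedged sketch of an IoT-style honeypot: a bare TCP listener on a Telnet-style port that logs every connection attempt and tracks unique source IPs over time. It performs no device emulation and is in no way Allot’s honeypot setup; the port and logging details are arbitrary.

```python
import socket
from datetime import datetime, timezone

seen_ips = set()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", 2323))          # use port 23 if running with privileges
    srv.listen()
    while True:
        conn, (ip, port) = srv.accept()  # every unsolicited connection is an attempt
        seen_ips.add(ip)
        print(f"{datetime.now(timezone.utc).isoformat()} attempt from {ip}:{port} "
              f"({len(seen_ips)} unique source IPs so far)")
        conn.close()
```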

Connected devices are forecast to grow to almost 31 billion worldwide by 2020. To help combat rising threats across this expanding mobile and IoT attack surface, the report found that CSPs are best positioned to deliver a unified, multilayer security service at the network level to the mass market. By merging value-added network-based security with built-in customer engagement capabilities, the company says CSPs can simultaneously achieve rapid customer acquisition and high adoption rates of 40%, while generating incremental revenue.

“Cybercrime has become rampant across the growing mobile and IoT attack surface due to the financial motivation it provides,” says Ronen Priel, VP product management at Allot. “CSPs can differentiate themselves from the competition by offering value-added security services to subscribers who are constantly under attack, while generating incremental revenue. It’s a win-win for both the CSP and the subscriber.”

The post How to combat against the rising mobile and IoT threats? appeared first on Network Communications News (NCN).

With customer experience and convenience acting as major drivers for the adoption of biometrics, it was only a matter of time until a company started to research the impact this could have on the current IoT and smart city movement. But the question on everyone’s lips here at NCN is: how influential could the adoption of biometrics actually be?

With a focus on banking, Goode Intelligence has published the results of its latest research project: ‘Biometrics for Banking; Market and Technology Analysis, Adoption Strategies and Forecasts 2018-2023 – Second Edition’. The company says the research identifies that, by the end of 2020, some 1.9 billion bank customers will be using biometrics for a number of tasks, such as:

  • Withdraw cash from ATMs.
  • Prove their identity when contacting their bank via telephone (both actively and passively).
  • Prove identity for digital on-boarding using their face.
  • Access digital bank services through an increasing number of connected IoT devices including smart home devices and via connected car and smart city services.
  • Authenticate into a mobile bank app using their biometric, either using an embedded sensor or through an app or SDK.
  • And, use a combination of biometric modalities (face and voice for instance) to initiate money transfers when accessing web-based eBanking services.

Biometric vendors are experiencing tremendous growth on the back of escalating consumer-led adoption of biometric authentication. Adoption for banking purposes is a major contributor to this growth, and Goode Intelligence forecasts that by 2023 it will contribute US$4.8 billion in revenue for companies involved in delivering biometric systems to the banking industry. During its research the company recognised that biometrics is an increasingly vital part of a bank’s toolkit: faced with the never-ending task of reducing financial fraud and ensuring that customers can conveniently prove their identity, the customer-first bank can use a biometrics system to deliver smarter identity verification and authentication.

Alan Goode, founder and CEO of Goode Intelligence and author of the report, says, “Biometric technology is being rapidly deployed to support a wide range of banking services, from the traditional – ATMs and branches – to the new banking channels of mobile and IoT. Customer experience and convenience are major drivers for the adoption of biometrics by agile third parties wanting to differentiate their services from one another – it will be an ultra-competitive market, and biometric authentication could be a key differentiator.

“The emergence of new channels is being driven by the Internet of Things (IoT) and we are only at the beginning of a movement that allows bank customers to access banking services from a wide range of intelligent connected devices that include the smart home, smart car and smart city. The availability of secure banking APIs – part of the open banking movement –  is allowing third parties to integrate banking services into their devices and services, allowing bank customers to better manage their day-to-day finances. Biometric technology is fast becoming the glue that binds this technology together – passively verifying a person’s voice while they talk to their smart speaker – allowing them to pull up their latest account balance with a voice command.  Then actively requesting a face or palmprint when the bank’s risk engine decides that a money transfer request is outside the normal risk appetite. For instance, that ride-share through the streets of central London is a riskier transaction than the one initiated at home. This linking of fraud management, adaptive authentication and a choice of passive and active biometric tools, will be key for banks wanting to stay in the game.

“Of course, treating biometrics as an important tool for banks, rather than thinking of it as a silver bullet, is vital in ensuring that digital transformation projects which leverage biometric technology are successful.”
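The adaptive pattern Goode describes – passive verification for routine requests, an active biometric challenge when the risk engine judges a transaction to be outside the normal appetite – can be sketched in a few lines. The factors, weights and thresholds below are entirely hypothetical and are not any bank’s or vendor’s policy; they simply illustrate the step-up decision.

```python
def risk_score(request):
    # Toy risk factors: action type, amount, location familiarity, and whether a
    # passive voice match has already succeeded in the background.
    score = 0
    if request["action"] == "money_transfer":
        score += 40
    if request.get("amount", 0) > 500:
        score += 20
    if not request.get("known_location", True):
        score += 30                      # e.g. a ride-share across town vs. at home
    if not request.get("passive_voice_match", False):
        score += 30
    return score

def required_auth(request):
    score = risk_score(request)
    if score < 40:
        return "passive"                 # background voice match is sufficient
    if score < 70:
        return "active_face"
    return "active_face_and_palm"

print(required_auth({"action": "balance_check", "passive_voice_match": True}))
print(required_auth({"action": "money_transfer", "amount": 900,
                     "known_location": False, "passive_voice_match": True}))
```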

The report investigates current global adoption, with market analysis including key drivers of and barriers to adoption, interviews with leading stakeholders, technology analysis reviewing key biometric technologies, and profiles of companies supplying biometric systems to banks.

The post Banking on biometrics appeared first on Network Communications News (NCN).
