Red Hat Enterprise Linux (RHEL) provides the stable foundation for advanced workloads both on-premises and in public clouds such as Microsoft Azure.
For Azure, Red Hat and Microsoft have been working together to provide the tools needed to deliver workloads faster, with less effort, backed by tightly integrated, enterprise-grade support. Many of today's workloads require some form of enterprise database, and one of the most popular is Microsoft SQL Server.
We've given several presentations covering how using RHEL as the operating system platform for SQL Server helps organizations reduce costs, avoid reliance on a single vendor, and increase performance. The response from SQL Server DBAs has been impressive, with nearly one-third of those surveyed by Red Hat and Unisphere Research saying that they were running the database on Linux today, and with RHEL being the most popular distribution for deployment.
The Friday Five is a weekly Red Hat blog post with 5 of the week's top news items and ideas from or about Red Hat and the technology industry. Consider it your weekly digest of things that caught our eye.
- Software development in 2022: 5 realities CIOs should know
- Intelligent CIO - Employers centralizes insurance apps on Red Hat OpenShift
- LISTEN TO THE PODCAST: How Does Data Help Shape Movies?
- TrustyAI - an open source project looking to solve AI's bias
- IT Brew - All hands on deck: Gov officials call for cybersecurity help from everyone
Read on for details.
A new type of computer is being developed that can break many of our existing cryptographic algorithms.
As a result, we need to develop new algorithms that are secure against those computers and that will run on our existing computers. This is called "post-quantum cryptography".
What is a quantum computer?
In 1981, Richard Feynman proposed a new way to model quantum interactions in complex systems. There is a problem when modeling these interactions, however, in that we need to represent each linked particle as a set of probabilities. As we add particles, these arrays grow exponentially. For any sufficiently large system, we can no longer handle the storage and time requirements using existing computers.
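To make the exponential growth concrete, here is a minimal sketch (not from the original text, with illustrative parameter names) of how the storage needed to describe a system of linked quantum particles scales: n two-level particles (qubits) require 2**n complex amplitudes.

```python
# Sketch: classical storage required to represent an n-qubit quantum state.
# The state vector holds 2**n complex amplitudes, so memory grows
# exponentially with the number of particles.

def amplitudes(n_qubits: int) -> int:
    """Number of complex amplitudes describing n linked qubits."""
    return 2 ** n_qubits

def memory_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Approximate memory for the full state vector.

    16 bytes = one complex number at double precision (assumed encoding).
    """
    return amplitudes(n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    gib = memory_bytes(n) / 2**30
    print(f"{n} qubits -> {amplitudes(n)} amplitudes (~{gib:,.0f} GiB)")
```

At 30 qubits the state vector already needs about 16 GiB; at 50 qubits it exceeds the memory of any existing classical machine, which is the bottleneck Feynman's proposal sidesteps.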
The next decade will see giant leaps forward in 5G, edge computing, enterprise Linux and plenty of other areas. As organizations look ahead, they must weigh the opportunities against the risks.
One such exciting area is artificial intelligence (AI). As the tools and methodologies advance, many organizations are looking to use AI to improve business efficiencies, bring innovation to customers faster, gain actionable market insights and more. However, the rush to put AI in place without always knowing what it can be used for, or how to use it, can lead to problems with the systems and the data itself. We have heard many stories of when AI makes the "wrong" decision due to built-in biases, and in some cases, the outcome can be life or death.
In a recent blog post, we talked about why many people are choosing to use cloud services instead of self-managed infrastructure.
According to a recent report, 68% of organizations are deploying application services in cloud environments.1 Using cloud services, including application and data services, helps teams focus on the work that's most important to them while trusted experts manage the infrastructure. Using Red Hat OpenShift cloud services, including application and data services like Red Hat OpenShift Streams for Apache Kafka, Red Hat OpenShift API Management and Red Hat OpenShift Data Science, helps organizations shorten development cycles.