Related Content
- The Importance of Data Classification in a Post-GDPR World: Automated data classification makes it possible to organize and secure data for compliance purposes, assists with deleting data that is no longer needed, enables monitoring and alerting, and reduces the cost of compliance.
- Why Use the MySQL Cluster Database: If you need a distributed database that serves millions of users and sustains a high-volume data load, MySQL Cluster is the database to use.
- The Rise of Security Challenges for the Data Cloud: Data is being dispersed across numerous databases, microservices, analytics tools, and pipelines, and the resulting security concerns will soon be top of mind for engineering and security teams alike.
- Why use JSON Web Token (JWT) in Authentication: JSON Web Token (JWT) is an open, JSON-based standard for securely transmitting information between parties. Beyond secure information exchange, JWT can also be used for authentication; a minimal token sketch follows this list.
- 3 Serverless Strategies to Look for in 2021: This article examines three serverless application development and deployment approaches that are transforming the application development process and catalyzing rapid adoption of DevOps practices across the board.
- What Exactly Is Serverless? The word serverless is everywhere; it was Googled an average of 100 times a day in 2020. Is serverless just a buzzword? A facade? Or a world where we won't need servers anymore?
- How to Make a Fixed-Scope Contract More Agile: Establishing a contract that genuinely supports agile methods can be a significant challenge. By its very nature, a contract that specifies detailed, upfront deliverables contravenes the principles of flexibility and adaptation at the heart of agile. But it is possible; both parties just need to focus on results.
- Breaking Down Apache's Hadoop Distributed File System: Apache Hadoop is a framework for big data, and one of its main components, the Hadoop Distributed File System (HDFS), stores that data. You might expect that holding such large quantities of data reliably would require state-of-the-art, failure-proof infrastructure, but quite the contrary is true: HDFS runs on commodity hardware and tolerates node failures by replicating each file's blocks across machines. A short write/read sketch follows this list.
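For the JWT entry above, here is a minimal sketch of issuing and verifying a token in Python, assuming the PyJWT library; the secret, claim names, and 15-minute expiry are illustrative choices, not prescriptions from the article.

```python
# A minimal sketch of JWT-based authentication, assuming the PyJWT
# library ("pip install pyjwt"). The secret, claims, and 15-minute
# expiry below are illustrative, not definitive.
import datetime

import jwt  # PyJWT

SECRET = "change-me"  # hypothetical shared secret; use a strong key in practice

def issue_token(user_id: str) -> str:
    # Sign a short-lived token asserting who the user is.
    claims = {
        "sub": user_id,
        "exp": datetime.datetime.now(datetime.timezone.utc)
        + datetime.timedelta(minutes=15),
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")

def verify_token(token: str) -> dict:
    # Checks the signature and the "exp" claim; raises
    # jwt.InvalidTokenError if either check fails.
    return jwt.decode(token, SECRET, algorithms=["HS256"])

print(verify_token(issue_token("alice"))["sub"])  # -> alice
```

Because the server only needs the secret to verify a token, it can authenticate requests without a session store, which is what makes JWT attractive for authentication.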
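And for the HDFS entry, a short sketch of writing and reading a file through pyarrow's HadoopFileSystem binding; the namenode host, port, replication factor, and path are hypothetical, and a reachable Hadoop cluster with libhdfs is assumed.

```python
# A minimal sketch of writing to and reading from HDFS via pyarrow's
# HadoopFileSystem binding. The namenode host, port, replication
# factor, and path are hypothetical.
from pyarrow import fs

# HDFS keeps data safe on commodity machines by replicating each
# file's blocks; here every block is stored on three nodes.
hdfs = fs.HadoopFileSystem(host="namenode.example.com", port=8020, replication=3)

with hdfs.open_output_stream("/data/example.txt") as out:
    out.write(b"HDFS replicates this file's blocks across the cluster.\n")

# If a node fails, any surviving replica can serve the read.
with hdfs.open_input_stream("/data/example.txt") as src:
    print(src.read().decode())
```

Replication is what lets HDFS trade expensive, failure-proof hardware for many cheap machines: losing a node costs nothing but a re-replication of its blocks.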