Related Content
Breaking Down Apache’s Hadoop Distributed File System: Apache Hadoop is a framework for big data. One of its main components is HDFS, the Hadoop Distributed File System, which stores that data. You might expect a storage framework that holds such large quantities of data to require state-of-the-art, failure-proof infrastructure, but quite the contrary is true.
Comparing 4 Top Cross-Browser Testing Frameworks: The market is flooded with cross-browser testing frameworks, with more options than ever before. How should you decide which option is best to test your application for compatibility with different web browsers? Let’s take a look at four of the top open source solutions today and compare their benefits and drawbacks.
Lessons the Software Community Must Take from the Pandemic: Due to COVID-19, organizations of all types have had to implement continuity plans within an unreasonably short amount of time. These live experiments in agility have shaken up our industry, but they have also taught us invaluable lessons about digital transformation, cybersecurity, performance engineering, and more.
Comparing Apache Hadoop Data Storage Formats: Apache Hadoop can store data in several supported file formats. To decide which one you should use, analyze their properties and the type of data you want to store. Let's look at query time, data serialization, whether the file format is splittable, and whether it supports compression, then review some common use cases.
5 Pitfalls to Avoid When Developing AI Tools: Developing a tool that runs on artificial intelligence is mostly about training a machine with data. But you can’t just feed it information and expect AI to wave a magic wand and produce results. The type of data sets you use and how you use them to train the tool are important. Here are five pitfalls to be wary of.
Benefits of Using Columnar Storage in Relational Database Management Systems: Relational database management systems (RDBMS) store data in rows and columns. Most relational databases store data row-wise by default, but a few RDBMS offer the option to store data column-wise, which is a useful feature. Let’s look at the benefits of columnar storage and when you'd want to use it.
Choosing the Right Threat Modeling Methodology: Threat modeling has transitioned from a theoretical concept into an IT security best practice. Choosing the right methodology is a combination of finding what works for your SDLC maturity and ensuring it results in the desired outputs. Let’s look at four different methodologies and assess their strengths and weaknesses.
Comparing Apache Sqoop, Flume, and Kafka: Apache Sqoop, Flume, and Kafka are tools used in data science. All three are open source, distributed platforms designed to move data and to operate on unstructured data. Each also supports big data at the scale of petabytes and exabytes, and all are written in Java. But there are some differences between these platforms.