Speaker Interview: Chris Nauroth

Continuing our series of interviews and viewpoints from speakers at Hadoop Summit Europe, this interview is with Chris Nauroth, Software Engineer at Hortonworks, who – along with Arpit Agarwal, also of Hortonworks – will be speaking on “Interactive Hadoop via Flash and Memory” as part of the ‘Future of Hadoop’ track on Day 1, at 3:10pm.

You can register for Hadoop Summit here, and see the detailed schedule here.

HS: Tell us a little about your session.

Our session discusses two recent enhancements in HDFS that will help users get the most out of their clusters’ resources. Centralized Cache Management provides the ability to specify which HDFS files to retain in RAM at the DataNode. This reduces disk I/O and lowers read latency for frequently accessed files. Heterogeneous Storage provides the means to request storage of a file on a specific type of storage media, such as traditional HDD, SSD or RAM. This allows users to choose the storage media that best match their applications’ specific requirements for performance and durability.
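Both features are exposed through the HDFS command line. A minimal sketch of how a user might pin a directory into DataNode memory with Centralized Cache Management, assuming a running cluster and an illustrative path `/hot-data` and pool name `analytics-pool`:

```shell
# Create a cache pool to group and account for cache directives
hdfs cacheadmin -addPool analytics-pool

# Ask DataNodes to keep the blocks of /hot-data resident in RAM
hdfs cacheadmin -addDirective -path /hot-data -pool analytics-pool

# Verify what is being cached
hdfs cacheadmin -listDirectives
```

Heterogeneous Storage is surfaced in later Hadoop releases as storage policies, e.g. `hdfs storagepolicies -setStoragePolicy -path /hot-data -policy ALL_SSD` to request placement on SSD media; the exact commands depend on the Hadoop version deployed.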

HS: What made you want to talk about storage media for HDFS?

It’s an exciting time in the world of storage.  Traditionally, HDFS hasn’t differentiated between different types of storage media or supported any notion of cache hints.  These features help take HDFS to the next level of performance and expand its reach to support an even larger set of applications.

HS: What sessions are you most interested in seeing?

I’m interested in attending other sessions on HDFS architecture, such as the session on creating a standalone block management service to separate some of the concerns of the current NameNode.  I’m also interested in looking more closely at Tez and how it might drive different access patterns for HDFS files.

HS: Thanks! Good luck with your session, and we’ll see you in Amsterdam.

Chris Nauroth is a software engineer at Hortonworks and an Apache Hadoop committer. He is an active contributor to HDFS, YARN, and MapReduce. Prior to Hortonworks, Chris worked for Disney, where he deployed Hadoop, developed data management solutions on top of it, and was responsible for operational support.

Arpit Agarwal is a Member of Technical Staff at Hortonworks and an active Hadoop committer and contributor. His interests include distributed computing and software performance. He received his master’s degree in Computer Science from Georgia Tech in 2003.