Title Data engineering foundations. Part 2, Building data pipelines with Kafka and Nifi. [O'Reilly electronic resource]

Edition [First edition].
Publication Info. [Place of publication not identified] : Addison-Wesley Professional, [2022]
Description 1 online resource (1 video file (4 hr., 29 min.)) : sound, color.
Playing Time 04:29:00
Description digital (rdatr)
video file (rdaft)
Series Live lessons
LiveLessons (Indianapolis, Ind.)
Performer Doug Eadline, presenter.
Summary 4+ Hours of Video Instruction. The Perfect Way to Get Started with Data Pipelines, Kafka, and NiFi.

Data Engineering Foundations Part 2: Building Data Pipelines with Kafka and NiFi provides over four hours of video introducing you to creating data pipelines at scale with Kafka and NiFi. You learn to work with the Kafka message broker and discover how to establish NiFi dataflow. You also learn about data movement and storage. All software used in the videos is open source and freely available for your use and experimentation on the included virtual machine.

About the Instructor: Doug Eadline, PhD, began his career as a practitioner and a chronicler of the Linux Cluster HPC revolution and now documents big data analytics. Starting with the first Beowulf How To document, Dr. Eadline has written hundreds of articles, white papers, and instructional documents covering virtually all aspects of HPC computing. Prior to starting and editing the popular ClusterMonkey.net website in 2005, he served as editor-in-chief for ClusterWorld Magazine and was Senior HPC Editor for Linux Magazine. Currently, he is a consultant to the HPC industry and writes a monthly column in HPC Admin Magazine. He has practical hands-on experience in many aspects of HPC, including hardware and software design, benchmarking, storage, GPU, cloud, and parallel computing. He is the co-author of the Apache Hadoop YARN book and author of the Hadoop Fundamentals LiveLessons and Apache Hadoop YARN LiveLessons videos.

Skill Level: Beginner to Intermediate

Learn How To:
- Understand Kafka topics, brokers, and partitions
- Implement basic Kafka usage modes
- Use Kafka producers and consumers with Python
- Utilize the KafkaEsque graphical user interface
- Understand the core concepts of NiFi
- Understand NiFi flow and web UI components
- Understand direct data movement with HDFS
- Use HBase with Python Happybase
- Use Sqoop for database movement

Who Should Take This Course: Users, developers, and administrators interested in learning the fundamental aspects and operations of data engineering and scalable systems.

Course Requirements:
- Basic understanding of programming and development
- A working knowledge of Linux systems and tools
- Familiarity with Python

Lesson Descriptions:

Lesson 7: Working with the Kafka Message Broker. In Lesson 7, Doug introduces the Kafka message broker concept and describes the producer-consumer model that enables input data to be reliably decoupled from output requests. Kafka producers and consumers are developed using Python, and internal broker operations are displayed using the KafkaEsque graphical user interface. (A minimal Python sketch of this pattern follows below.)

Lesson 8: Working with NiFi Dataflow. Lesson 8 begins with a description of NiFi flow-based programming and then provides several examples, including writing pipeline data to the local file system, then to the Hadoop Distributed File System, and finally to Hadoop Hive tables. The entire flow process is constructed using the NiFi web graphical user interface. The creation of portable flow templates for all examples is also presented.

Lesson 9: Big Data Movement and Storage. Lesson 9 provides you with several methods for moving data to and from the Hadoop Distributed File System. Hands-on examples include direct web downloads and using Python Pydoop to move data. Basic data movement between Apache HBase, Hive, and Spark using Python Happybase and Hive-SQL is also presented. Finally, movement of relational data to and from the Hadoop Distributed File System is demonstrated using Apache Sqoop. (Sketches of the Pydoop and Happybase steps follow below.)
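To give a flavor of the producer-consumer pattern Lesson 7 covers, here is a minimal Python sketch. It assumes the third-party kafka-python package and a broker listening on localhost:9092; the topic name "demo-topic" is a placeholder, not something taken from the course.

    # Minimal producer-consumer sketch, assuming kafka-python and a
    # broker on localhost:9092; "demo-topic" is illustrative only.
    from kafka import KafkaProducer, KafkaConsumer

    # Producer: publish a few messages (values must be bytes by default).
    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    for i in range(3):
        producer.send("demo-topic", value=f"message {i}".encode("utf-8"))
    producer.flush()  # block until buffered messages are delivered

    # Consumer: read everything back from the earliest offset,
    # giving up after 5 seconds of inactivity.
    consumer = KafkaConsumer(
        "demo-topic",
        bootstrap_servers="localhost:9092",
        auto_offset_reset="earliest",
        consumer_timeout_ms=5000,
    )
    for record in consumer:
        print(record.partition, record.offset, record.value.decode("utf-8"))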
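The Pydoop-based HDFS movement in Lesson 9 might look roughly like the following sketch. It assumes a reachable HDFS (such as the one on the included virtual machine) and the pydoop package; the file and directory paths are illustrative.

    # Rough sketch of HDFS file movement with Pydoop; assumes a running
    # HDFS and that /user/demo exists. Paths are illustrative only.
    import pydoop.hdfs as hdfs

    hdfs.put("data.csv", "/user/demo/data.csv")   # copy local -> HDFS
    print(hdfs.ls("/user/demo"))                  # list the target directory

    with hdfs.open("/user/demo/data.csv", "rt") as f:  # read back as text
        for line in f:
            print(line.rstrip())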
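Similarly, the Happybase access to HBase that Lesson 9 demonstrates could resemble this sketch, assuming an HBase Thrift server running on localhost; the table and column names here are made up for illustration.

    # Minimal HBase sketch with Happybase, assuming an HBase Thrift
    # server on localhost; table and column names are made up.
    import happybase

    connection = happybase.Connection("localhost")
    if b"demo_table" not in connection.tables():
        # One column family named "cf" with default settings.
        connection.create_table("demo_table", {"cf": dict()})

    table = connection.table("demo_table")
    table.put(b"row1", {b"cf:count": b"42"})  # write a single cell
    print(table.row(b"row1"))                 # read the row back
    connection.close()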
About Pearson Video Training: Pearson publishes expert-led video tutorials covering a wide selection of technology topics designed to teach you the skills you need to succeed. These professional and personal technology videos feature world-leading author instructors published by your trusted technology brands: Addison-Wesley, Cisco Press, Pearson IT Certification, Sams, and Que. Topics include: IT Certification, Network Security, Cisco Technology, Programming, Web Development, Mobile Development, and more. Learn more about Pearson Video training at http://www.informit.com/video.
Subject Apache Kafka.
Application software -- Development.
Electronic data processing.
Genre Instructional films
Internet videos
Nonfiction films
Added Author Eadline, Doug, 1956- presenter.
Addison-Wesley Professional (Firm), publisher.
Added Title Building data pipelines with Kafka and Nifi
ISBN 9780138087029 (electronic video)
0138087024 (electronic video)