Data Science at NVIDIA GTC 2020
October 9, 2020 | 9:00 AM–8:00 PM EDT | Virtual Gathering at NVIDIA's GTC
Where users, partners, customers, contributors, and upstream project leads come together to collaborate across the OpenShift Cloud Native ecosystem.
OpenShift Commons: Validated Patterns SIG, Red Hat OpenShift and MLOps
In this session, we see the deployment of a machine learning model using data science pipelines and how we can set up these pipelines on OpenShift AI.
OpenShift Commons Virtual Meetings: Kubeflow and the AI lifecycle with Ricardo Martinelli de Oliveira
AI inference at the edge using OpenShift AI
Many organizations are looking to edge deployments of AI to provide real-time insights into their data. OpenShift AI can be used to serve models that predict failures, detect anomalies, and perform quality inspection in low-latency environments in near real time. The demo shows how a model can be packaged into an inference container image and how data science pipelines can fetch models, then build, test, deploy, and update them within a GitOps workflow. ArgoCD detects changes and updates the image on edge devices as needed. Observability into the model’s health and performance is provided by gathering metrics from edge devices and reporting them back to centralized management. Learn more: https://red.ht/openshift_ai
OpenShift Commons: A conversation about OpenShift Lightspeed with Erik Jacobs, Red Hat
OpenShift Commons: OpenShift Lightspeed with Erik Jacobs, Red Hat. A conversation about OpenShift Lightspeed and a demo session. Wednesday, July 17, 2024. Join our live sessions every Wednesday, 12:00 PM - 1:00 PM EDT (9:00 AM PT): https://Red.ht/commons-general Review our agenda: red.ht/commons-GENERAL-agenda Find us at: https://commons.openshift.org/join
OpenShift Commons: Edge SIG: Model training in Red Hat OpenShift AI: Prepare and label datasets
OpenShift Commons: Edge SIG - Model training in Red Hat OpenShift AI - Prepare and label custom datasets with Label Studio, with Diego Alvarez Ponce, Red Hat. · Prepare and label custom datasets with Label Studio: https://developers.redhat.com/articles/2024/05/02/prepare-and-label-custom-datasets-label-studio · Model training in Red Hat OpenShift AI: https://developers.redhat.com/articles/2024/05/02/model-training-red-hat-openshift-ai Tuesday, July 16, 2024. Review our agenda and join our live sessions: https://red.ht/commons-edge-sig-agenda https://commons.openshift.org/join
OpenShift Commons: Presenting Edge AI Series (Edge SIG) and Validated Patterns SIG
OpenShift Commons: * Intro to Edge AI Series with Ben Cohen, Diego Alvarez Ponce, and Myriam Fentanes Gutierrez * Intro to Validated Patterns SIG with Anthony Herr and Michael St-Jean. Wednesday, July 3rd, 2024. https://commons.openshift.org/join
OpenShift Commons: Validated Patterns SIG, Deploying RAG LLM with Validated Patterns
OpenShift Commons: Validated Patterns SIG session on Deploying RAG LLM with Validated Patterns, with Saurabh Agarwal (Red Hat) and Swati Kale (Red Hat). https://validatedpatterns.io/patterns/rag-llm-gitops/ Tuesday, June 25, 2024. https://validatedpatterns.io/ Join us every other Tuesday at 12:00 PM Eastern: https://red.ht/commons-vp-sig View upcoming sessions: http://red.ht/commons-vpsig-agenda https://commons.openshift.org/join
OpenShift Commons: Edge SIG: Computer vision at the edge: Single Node OpenShift installation
OpenShift Commons: Edge SIG - Computer vision at the edge: Single Node OpenShift installation, with Diego Alvarez (Red Hat). Tuesday, June 18, 2024. https://commons.openshift.org/join
OpenShift Commons: Konveyor AI Overview and Demo with Ramon Roman and Syed M. Shaaf Red Hat
OpenShift Commons: Konveyor AI Overview and Demo with Ramon Roman (Red Hat) and Syed M. Shaaf (Red Hat). Wednesday, June 12, 2024. https://commons.openshift.org/join
The power of AI is in open source (El poder de la IA está en el código abierto)
Red Hat's role has always been to act as a catalyst within open source communities. Maria Bracho, CTO for Latin America at Red Hat, and Myriam Fentanes, Principal Technical Product Manager for Red Hat OpenShift AI, discuss efforts to bring AI innovation to everyone.
What does Postgres have to do with AI?
Crunchy Data's Blair McDuff discusses the partnership between Crunchy Data and Red Hat, highlighting how to use PostgreSQL to harness the power of AI when making business decisions.
AI at Red Hat Summit
Take a quick tour of the Red Hat Central area at Red Hat Summit, spanning the interactive AI activation center, image generation, Retrieval Augmented Generation (RAG) patterns, knowledge-based chatbots, and AI-assisted farming. Partners such as Dell and Neural Magic are also highlighted. Experience the fun. Learn more: https://red.ht/openshift_ai
How do we prevent AI hallucinations?
Taneem Ibrahim, Engineering Manager, and Daniele Zonca, Senior Principal Software Engineer, for Red Hat OpenShift AI discuss a common problem with foundation models, hallucinations, and some evolving methods for limiting these challenges. Learn more: https://red.ht/openshift_ai
Embarking on your personal AI journey
Adel Zaalouk, Principal Technical Product Manager at Red Hat, shares his personal experience getting involved with AI, beginning with product discovery in product management. His journey led him to the CNCF working group on Cloud Native and AI, which resulted in the whitepaper “Cloud Native Artificial Intelligence.” Link: https://www.cncf.io/reports/cloud-native-artificial-intelligence-whitepaper Learn more: https://commons.openshift.org
Using Red Hat Enterprise Linux as a foundation for AI
Gunnar Hellekson, VP and GM for Red Hat Enterprise Linux, describes some of the capabilities that Red Hat Enterprise Linux provides as a foundation layer for AI workloads. Using the same standardized platform that leading enterprise companies are already deploying, RHEL provides capabilities like accelerated GPU hardware enablement and bundled prerequisite libraries needed for AI work across the datacenter, public cloud, or even edge. Learn more: https://red.ht/openshift_ai
Using OpenShift for AI models
Chuck Dubuque, Senior Director of Marketing for Red Hat’s Hybrid Platforms BU, describes some of the benefits of using Red Hat OpenShift as an underlying platform for tuning and building models, bringing those models to application developers, and moving AI-enabled applications into production. Learn more: https://red.ht/openshift_ai
Consistent platforms in the changing world of AI
AI isn’t going away. OpenShift AI helps customers solve some of their challenges by operationalizing both traditional and generative AI projects. Why should customers think of Red Hat for AI? Steven Huels, GM for Red Hat’s AI Business Unit, describes how OpenShift AI provides a consistent and flexible AI platform for an ever-changing ecosystem as AI continues to evolve. Learn more: https://red.ht/openshift_ai
Navigate across the MLOps journey
Businesses are struggling to get value from AI quickly. Getting access to compute resources without running up huge cloud bills and picking a core AI platform that helps navigate the MLOps lifecycle are two pieces of advice that Alex Corvin, Engineering Manager in Red Hat’s AI business unit, provides. Learn more: https://red.ht/openshift_ai
Bridging the gap between MLOps and DevOps with OpenShift AI
The use of generative AI to create meaningful services that help businesses and people has accelerated over the last year. OpenShift AI bridges the gap between data scientists who are creating the models in the MLOps world and the application developers creating applications in the DevOps world. Learn more: https://red.ht/openshift_ai
Red Hat OpenShift AI: the basics
Taneem Ibrahim, Senior Manager of Engineering for Red Hat OpenShift AI, provides a short description of OpenShift AI and discusses the challenges of LLM adoption and how the platform simplifies and scales access to GPUs for data scientists. Learn more: http://red.ht/openshift_ai
Demo: NVIDIA AI Enterprise with Red Hat OpenShift
This demo showcases an AI application built on Red Hat OpenShift AI with Red Hat partner NVIDIA, simulating an AI system that generates a project proposal for a named customer.
Red Hat OpenShift AI Demo
In this video, Chris Chase demonstrates a typical workflow that includes creating a project, launching a Jupyter notebook with appropriate cluster resources and training a foundation model from Hugging Face with one’s own data. Once the model is fine-tuned, Chris automates the build using a data science pipeline and serves the model for use in an AI-enabled application.
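For context on what "training a foundation model from Hugging Face with one's own data" can look like, here is a rough, hedged sketch using the transformers Trainer API. The base model (distilbert-base-uncased) and dataset (a small slice of imdb) are illustrative assumptions, not the choices made in the demo.

```python
# Hypothetical fine-tuning sketch; not the demo's actual code.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "distilbert-base-uncased"  # assumed base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# A tiny slice of a public dataset stands in for "your own data".
dataset = load_dataset("imdb", split="train[:1%]")
dataset = dataset.map(
    lambda ex: tokenizer(ex["text"], truncation=True, padding="max_length"),
    batched=True,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=dataset,
)
trainer.train()
trainer.save_model("out/fine-tuned")  # artifact a data science pipeline could pick up and serve
```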
Paladin Cloud CTO Steve Hull on AI, cloud security, and open source
It's been a year and a half since Paladin Cloud launched a cloud security management platform and prioritization engine. Steve Hull, Co-founder and CTO of Paladin Cloud, discusses the product's key features and how AI has reshaped the security landscape.
AI 101: What’s the difference between AI and ML?
Welcome to the AI 101 series, where we demystify the basics of Artificial Intelligence (AI) to help you better understand basic AI concepts. We hope these videos help you start to apply AI to your organization’s strategy. Join us as we explore the foundations of AI.

In our debut video, Prasanth Anbalagan (Senior Principal Technical Marketing Manager for AI) defines and explores the basic concepts of Artificial Intelligence vs. Machine Learning (AI vs. ML). These terms are often confused, and we want to help you better understand what each one is (and isn’t) as you move along in your AI journey. Prasanth compares and contrasts the technologies and provides clear examples of how they work. In the comments section, please let us know future topics we should cover in AI 101. Learn more at RedHat.com
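As a hedged illustration of the AI vs. ML distinction (not taken from the video), the sketch below contrasts a hand-written rule with a scikit-learn model that learns a similar rule from labeled examples; the threshold and toy data are invented for illustration.

```python
# Illustrative only: rule-based logic vs. a model learned from examples.
from sklearn.linear_model import LogisticRegression

# Hand-written rule: flag transactions over $1,000 as suspicious.
def rule_based_flag(amount: float) -> bool:
    return amount > 1000

# Machine learning: learn a similar decision boundary from labeled toy data.
amounts = [[120], [950], [1500], [3000], [40], [2200]]
labels = [0, 0, 1, 1, 0, 1]  # 1 = suspicious in this toy dataset

model = LogisticRegression().fit(amounts, labels)
print(rule_based_flag(1800), model.predict([[1800]])[0])
```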
OpenShift Commons Gathering Chicago 2023 - Case Study: OpenShift AI at NASA
Carlos Costa (IBM), Hongchao Deng (Anyscale), and Alex Corvin (Red Hat) present at the OpenShift Commons Gathering Co-Located with KubeCon + CloudNativeCon North America 2023. Slides: https://speakerdeck.com/openshift_commons/c7df541cf8b772a643a3ea3150c2117c
OpenShift Commons Raleigh - Panel: Secure Software Supply Chain
Anne Marie Fred (Red Hat), Andrew Block (Red Hat), Rob Szumski (EdgeBit)
OpenShift Commons Raleigh - AI Partner Panel with Dell, IBM and Neural Magic
Frank La Vigne (Red Hat), Steven Huels (Red Hat), Raj Shirsolkar (Dell), Brad Topol (IBM), Jay Marshall (Neural Magic)
OpenShift Commons Gathering, Raleigh - Crunchy Data Lightning Talk: AI for Postgres
Bob Pacheco (Crunchy Data) explores these key questions: What does Postgres have to do with AI? Why Postgres at all?
OpenShift Commons Gathering, Raleigh - Will open source continue to dominate in the age of AI?
Jeremy Eder (Red Hat) explores fundamental questions amid the recent surge in AI adoption: What is open source AI, why is open AI important, and how will the industry arrive at a consensus?
OpenShift Commons Gathering, Raleigh - Ansible Lightspeed: The Evolution of Artificial Intelligence
Matt Jones (Red Hat) speaks about Ansible Lightspeed and the potential of generative AI to transform enterprise automation.
OpenShift Commons Gathering, Raleigh - Demos: Computer Vision and Reinforcement Learning
Frank La Vigne (Red Hat) demos computer vision and reinforcement learning on Red Hat OpenShift at the OpenShift Commons 2023 Gathering in Raleigh, North Carolina.
OpenShift Commons Gathering, Raleigh - Artificial Intelligence at Red Hat: What's Next
AI has embedded itself in every industry and every executive is looking at how to apply it to their space. Sherard Griffin (Red Hat) discusses how to leverage operational models and get them into production. Griffin introduces Red Hat OpenShift AI, generative AI for the open hybrid cloud.
Simplifying the training and deployment of large foundation models
Being able to quickly scale OpenShift cluster resources up or down becomes more critical as the fine-tuning of foundation models gains popularity. Red Hat and IBM Research have worked together to open source a generative AI infrastructure stack that is provided in Open Data Hub and is being productized as part of OpenShift AI. In this demo, Mustafa Eyceoz shows how you can use Ray, CodeFlare, and Multi-Cluster App Dispatcher (MCAD) technology to prioritize real-time access to cluster resources or schedule workloads for batch processing.
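As a tiny illustration of the Ray-based scaling model referenced in the demo, here is a hedged sketch of distributing work with Ray remote tasks. It omits CodeFlare and MCAD entirely, and the task body is a placeholder rather than a real training step.

```python
# Minimal Ray sketch, illustrative only.
import ray

ray.init()  # in the demo's context this would connect to a Ray cluster running on OpenShift

@ray.remote
def train_shard(shard_id: int) -> float:
    # Placeholder for a real training step on one data shard.
    return 0.1 * shard_id

# Fan out four tasks in parallel and gather their results.
losses = ray.get([train_shard.remote(i) for i in range(4)])
print(losses)
```

Connecting to a remote cluster instead of a local one would typically use ray.init(address=...), which CodeFlare-provisioned clusters expose.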
Red Hat and NVIDIA Partner to Bring AI to the Enterprise
How do you take generative AI and bring it into the enterprise sphere where it has all the support, all the ecosystem engagement, and all the accelerated infrastructure required to run it? NVIDIA's Amanda Saunders discusses how Red Hat and NVIDIA are partnering to make this possible.
Unleashing AI for every organization with Red Hat OpenShift and NVIDIA
As more organizations start integrating AI applications in their business to deliver better insights and more value to their customers, they are also faced with the complexity of integrating these AI solutions with existing infrastructures, scaling them and keeping them accurate. Red Hat and NVIDIA have partnered to unlock the power of AI for every business by delivering an end-to-end enterprise platform optimized for AI workloads. Kick-start your AI journey with free access to NVIDIA AI Enterprise with Red Hat OpenShift. Visit our hands-on lab and try it today: https://www.nvidia.com/red-hat-launchpad
SWIFT and Red Hat OpenShift deliver an AI platform for financial transaction intelligence at scale
SWIFT is shaping the future of payments and securities to be faster, smarter and better. As the financial industry’s neutral and trusted provider, we help our community of over 11,000 financial institutions move value around the world reliably and more securely at scale. SWIFT is now leveraging our pivotal role in the financial industry to develop transformative AI solutions, enabled by a high-performance AI platform that is future ready for hybrid cloud. Join this keynote to learn how SWIFT, along with Red Hat, C3.ai, Kove and partner financial institutions, is embarking on a journey to enhance the effectiveness and efficiency of shared services like payment screening and anomaly detection, leveraging unique global transaction data – without compromising the integrity of transaction data or the privacy of its users. Speakers: Marius Bogoevici (Red Hat) and Chalapathy Neti (SWIFT)
Starburst partners with Red Hat on OpenShift Data Science for AI/ML workloads
Justin Borgman, CEO of Starburst, describes how Starburst partners with Red Hat, integrating with Red Hat OpenShift Data Science to provide insights into an organization’s data for analytics and AI/ML workloads. Justin describes how the Starburst data mesh capabilities can help data engineers to access decentralized data and eliminate the need for data wrangling.
Top 5 Considerations for an AI/ML Platform
Will McGrath, Product Marketing Manager in Red Hat’s Data Services business unit, discusses the top 5 considerations for building out an AI platform. Whether it’s developing a data strategy or building a collaborative environment, these guidelines will help your organization achieve success as your machine learning projects move from experimentation to production.
What is Kubeflow?
In this short 45-second video, Juana Nakfour describes Kubeflow and its advantages.
Using Open Data Hub for MLOps Demo
Juana Nakfour describes MLOps and demonstrates how to use Open Data Hub, a community project that integrates open source AI/ML tools into an end-to-end AI platform on Red Hat OpenShift. First, Juana uses the Elyra tool in JupyterHub to auto-generate and run an MLOps Kubeflow pipeline. After showcasing Red Hat Ceph Storage bucket notifications to trigger and automate the MLOps pipeline, Juana deploys the fraud detection neural network model with KFServing and uses a canary rollout to introduce a new version of the model.
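For readers new to Kubeflow pipelines, here is a minimal, hypothetical kfp v1-style sketch of a two-step train-then-deploy pipeline in the spirit of the one generated in the demo; the container images and step names are placeholders, not the demo's actual components.

```python
# Hypothetical Kubeflow Pipelines (kfp v1 style) sketch; images and scripts are placeholders.
import kfp
from kfp import dsl

@dsl.pipeline(name="fraud-detection", description="Train and deploy a fraud detection model")
def fraud_pipeline():
    train = dsl.ContainerOp(
        name="train",
        image="quay.io/example/fraud-train:latest",   # placeholder image
        command=["python", "train.py"],
    )
    deploy = dsl.ContainerOp(
        name="deploy",
        image="quay.io/example/fraud-deploy:latest",  # placeholder image
        command=["python", "deploy_kfserving.py"],
    )
    deploy.after(train)  # serve only after training succeeds

if __name__ == "__main__":
    kfp.compiler.Compiler().compile(fraud_pipeline, "fraud_pipeline.yaml")
```

The compiled YAML could then be uploaded through the Kubeflow Pipelines UI that Open Data Hub deploys, or submitted programmatically with the kfp client.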
How NTT and Red Hat have Built an Edge Offer to Deliver New Cloud-Native AI-Platform Services
Learn how cloud-native AI applications can be enabled on a new digital service edge platform from NTT East. The Multi-Access Edge Computing (MEC) platform is based on a Red Hat OpenShift architecture leveraging GPUs and the NVIDIA GPU Operator to deliver new AI application services across multi-cloud edge environments, including private 5G, fiber, telco edge, and customers’ enterprise edge. The cloud-native edge solution targets multiple industries, and a next-generation intelligent video analytics (IVA) example is highlighted for a retailer.
Red Hat OpenShift - Moonshot ideas
Innovation without limitation. Learn more at OpenShift.com
Delivering MLOps with OpenShift GitOps and Pipelines
As AI/ML modeling becomes more important, the growing expectation is to adopt and implement an MLOps strategy that makes it easier to productize models and keep them up to date. Explore the challenges this poses for data scientists and what OpenShift is doing to make productizing AI/ML models simpler with features like OpenShift GitOps, OpenShift Pipelines, Quay, and Red Hat OpenShift Data Science.
NVIDIA and Red Hat enabling faster delivery of AI-powered intelligent applications
Together, Red Hat’s OpenShift hybrid cloud platform and NVIDIA’s innovative GPUs, CUDA-X libraries, GPU Operator, and AI software from the NGC catalog help businesses quickly, consistently, and securely develop, deploy, and scale AI applications across the hybrid cloud. Learn more at https://www.OpenShift.com/nvidia
Red Hat Innovators in the Open | HCA Healthcare
HCA Healthcare uses an innovative data platform to accelerate the detection of sepsis and save lives, using a containerized ML application. Learn more at https://www.OpenShift.com/ai-ml
Red Hat’s AI/ML Technology Partnerships
Red Hat has taken a partner ecosystem approach to AI/ML use cases in a hybrid cloud environment, not only for the Open Data Hub project but also for its Red Hat OpenShift Data Science managed cloud service offering. In this short video, Ryan describes this approach and the key data science ISV partnerships.
Preview demo of Red Hat OpenShift Data Science
Updated video showcasing the latest features is available at: red.ht/ai_demo See a preview demo of the Red Hat OpenShift Data Science managed cloud service offering. Red Hat OpenShift Data Science combines common open source tooling, partner software, and other Red Hat portfolio software to provide a fully supported sandbox in which to rapidly develop, train, and test ML models in the public cloud. Chris Chase runs through a tutorial showing how easy it is to launch Jupyter notebooks, build a model in the TensorFlow framework, and deploy the model in a container-ready format.
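To give a flavor of the notebook workflow described above, here is a hedged TensorFlow/Keras sketch of building and saving a small model; the toy data, architecture, and file name are illustrative assumptions, not the model built in the demo.

```python
# Illustrative TensorFlow/Keras sketch of a notebook-style train-and-save step.
import numpy as np
import tensorflow as tf

# Toy features and binary labels stand in for real training data.
x = np.random.rand(256, 4).astype("float32")
y = (x.sum(axis=1) > 2.0).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=5, batch_size=32, verbose=0)

# Save an artifact that a containerized model server could load
# (older TensorFlow versions would export a SavedModel directory instead).
model.save("model.keras")
```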
Red Hat's AI vision and Open Data Hub (part 2)
As a proof point of Red Hat’s AI investments and vision, Steven highlights the origin of the Open Data Hub project, its use by Red Hat Consulting services to help customers put models into production and the ecosystem of AI/ML technology partners on Red Hat OpenShift. Learn more at: openshift.com/learn/topics/ai-ml
Red Hat's Artificial Intelligence (AI) Vision (part 1)
Steven Huels describes Red Hat’s AI/ML business focus and vision, including our investments in Kubernetes and DevOps as well as the technology partner ecosystem. Through the Open Data Hub project, Red Hat provides a reference for implementing open source tooling on OpenShift. Learn more at openshift.com/learn/topics/ai-ml
A Customer’s Story with Red Hat AI Solutions
Red Hat Consulting Services describes how they helped a customer solve problems of multiple siloed data sets, accelerate and scale data science workflows across thousands of users, and implement open source AI/ML technologies leveraging the Open Data Hub project. Results include faster ML deployments (reduced time to solution) and increased collaboration between developers and operations. Learn more at https://www.openshift.com/learn/topics/ai-ml
Using Open Data Hub as a Red Hat Data Scientist
Isabel Zimmerman demonstrates how to build, deploy, and monitor machine learning models using the Open Data Hub project for simplified end-to-end machine learning workflows. Isabel shows an ML workflow that includes Jupyter notebooks, Seldon for model hosting, and Prometheus and Grafana for monitoring and visualization. Learn more at openshift.com/ai-ml
A Day in the Life of a Red Hat Data Scientist
Don Chesworth describes what it's like to be a data scientist at Red Hat, including using the latest and greatest open source tooling. Don also shares an experience working with the Open Data Hub team to submit improvements for changing Red Hat OpenShift shared memory size across multiple GPUs. Learn more at openshift.com/ai-ml
Using MPI Operator for GPU-Accelerated Workloads with Lustre FS - David Gray, Red Hat | NVIDIA GTC OSCG
David Gray (Red Hat), NVIDIA GTC OpenShift Commons Gathering, October 9, 2020. High performance computing workloads increasingly rely on containers that make applications easier to manage, preserve their dependencies, and add portability across different environments. Red Hat OpenShift Container Platform is an enterprise-ready Kubernetes-based platform for deploying containerized applications on shared compute resources. An Operator is a method of packaging, deploying, and managing a Kubernetes-native application that can make it easier to run complex workloads. We'll demonstrate how GPU-accelerated scientific applications can be deployed on OpenShift using the Message Passing Interface (MPI) and backed by the Lustre file system for data storage.
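To give a concrete sense of the message-passing style of workload the MPI Operator schedules, here is a minimal mpi4py sketch. It is an assumption for illustration only; the talk's actual scientific application and Lustre-backed data paths are not shown.

```python
# Minimal mpi4py sketch of a message-passing workload (run with: mpirun -n 4 python script.py).
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each rank computes a partial result; rank 0 gathers and reduces them.
partial = rank * rank
total = comm.reduce(partial, op=MPI.SUM, root=0)

if rank == 0:
    print(f"Sum of squares across {size} ranks: {total}")
```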
NVIDIA GTC - Cory Latschkowski (ExxonMobil)
NVIDIA GTC - Michael Bennett (Dell), Diane Feddema (Red Hat)
NVIDIA GTC - Buck Woody (Microsoft)
The Enterprise Neurosystem Initiative Bill Wright (Red Hat) | NVIDIA GTC OSCG
The Enterprise Neurosystem Initiative - Bill Wright (Red Hat). NVIDIA GTC OpenShift Commons Gathering, October 9, 2020
Applying AIOps to Kubernetes Telemetry Data with Open Data Hub at OpenShift | NVIDIA GTC OSCG
Applying AIOps to Kubernetes Telemetry Data with Open Data Hub - Alex Corvin and Ivan Necas (Red Hat). NVIDIA GTC OpenShift Commons Gathering, October 9, 2020
AIOps vs MLOps vs DevOps Zak Berrie (Red Hat) | NVIDIA GTC OSCG
AIOps vs MLOps vs DevOps - Zak Berrie (Red Hat). NVIDIA GTC OpenShift Commons Gathering, October 9, 2020
AIOps: Anomaly Detection (Marcel Hild)
Red Hat uses techniques such as anomaly detection to identify issues in infrastructure and proactively address them, making its products even better. Learn more: openshift.com/ai-ml and openshift.com/storage
AI/ML at the edge with Red Hat OpenShift
Red Hat OpenShift simplifies the deployment and life-cycle management of AI-powered intelligent applications at the edge, just like it does in the cloud. Learn more at: openshift.com/edge
Red Hat Research: University Collaboration
Hugh Brock, Research Director at Red Hat, discusses how Red Hat connects university research with Red Hat engineers, ultimately driving innovation upstream. Learn more: https://research.redhat.com/
Intelligent Data Summit: Fast Track AI from Pilot to Production with a Kubernetes-powered platform
Red Hat's Abhinav Joshi presents "Fast Track AI from Pilot to Production with a Kubernetes-powered Platform" at the Intelligent Data Summit, hosted by Integration Developer News.
Ask Me Anything on OpenDataHub with Landon LaSmith (Red Hat) and ODH team members
AMA on OpenDataHub with Landon LaSmith (Red Hat), July 20, 2020
Using AI to Solve Problems
Michael Clifford, a data scientist at Red Hat, discusses how Red Hat uses Artificial Intelligence to solve operational problems and make the company's products better. Learn more: openshift.com/ai-ml
Top Considerations for Accelerating AI/ML Lifecycle in the Cloud-Native Era
Red Hat's Abhinav Joshi presents "Top Considerations for Accelerating AI/ML Lifecycle in the Cloud-Native Era" at the Cloud Architecture Summit, hosted by Integration Developer News. To learn more about OpenShift, visit: OpenShift.com To download the slides, visit: https://www.idevnews.com/registration?event_id=506&code=ws_sidebar
OpenShift Commons Briefing: Continuous Development and Deployment of AI/ML Models with Kubernetes
OpenShift Commons Briefing: Continuous Development & Deployment of AI/ML Models with Containers and Kubernetes. Guest speakers: Will Benton (Red Hat), Parag Dave (Red Hat), Peter Brey (Red Hat); hosted by Diane Mueller (Red Hat). June 4, 2020
Open Data Hub Introduction
Introduction to Open Data Hub, an open source project that provides an end-to-end AI/ML platform on OpenShift. Please visit opendatahub.io
ML Workflows on Red Hat OpenShift
Red Hat believes that machine learning (ML) workflows are just like traditional software development workflows. In this video we demonstrate how Red Hat OpenShift Container Platform can enable data scientists to leverage traditional DevOps methodologies to accelerate their ML workflows.
Harness the power of AI/ML with Red Hat OpenShift
In this webinar, Abhinav Joshi shares how you can harness the power of AI/ML with the Red Hat portfolio. He covers the challenges and potential of AI/ML workloads and the role of containers and Kubernetes in running them. Abhinav is the Product Marketing Lead for Artificial Intelligence/Machine Learning on OpenShift. For more information, please visit: openshift.com/ai-ml
Why deploy AI/ML (Artificial Intelligence & Machine Learning) workloads on OpenShift?
While organizations are turning to Artificial Intelligence and Machine Learning (AI/ML) to better serve customers, reduce cost, and gain other competitive advantages, there are significant challenges to executing these programs. Data scientists need a self-service experience that allows them to build, scale, and share their machine learning (ML) modeling results across the hybrid cloud. With Red Hat OpenShift, you'll enable data scientists to easily build and deploy their ML models without depending on IT to provision infrastructure. Learn more at openshift.com/ai-ml
Uploading data to Ceph via command line
In this tutorial we show you how to store data in an S3 Data Lake via the command line using the s3cmd tool.
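The tutorial itself uses the s3cmd command line tool; as a rough Python companion, the sketch below uploads a file to an S3-compatible Ceph endpoint with boto3. The endpoint URL, credentials, bucket, and file names are placeholders, not values from the tutorial.

```python
# Hypothetical example: upload a file to a Ceph RGW (S3-compatible) endpoint with boto3.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://ceph-rgw.example.com",  # placeholder Ceph S3 endpoint
    aws_access_key_id="ACCESS_KEY",               # placeholder credentials
    aws_secret_access_key="SECRET_KEY",
)

s3.create_bucket(Bucket="data-lake")              # skip if the bucket already exists
s3.upload_file("sales.csv", "data-lake", "raw/sales.csv")
print(s3.list_objects_v2(Bucket="data-lake").get("KeyCount"))
```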
Installing Open Data Hub on OpenShift 4.1
Tutorial demonstrating how to deploy Open Data Hub in an OpenShift 4.1 cluster using the OperatorHub catalog. At the end of this tutorial, you will be able to deploy a JupyterHub server and a Spark cluster, and create your own Jupyter notebook. The OpenShift 4.1 cluster was created using CodeReady Containers (https://code-ready.github.io/crc/). For more information on Open Data Hub, visit http://opendatahub.io/
Fraud Detection Using Open Data Hub on Openshift
Demonstration of an end-to-end AI/ML fraud detection use case using Open Data Hub on OpenShift. For more information on Open Data Hub, visit http://opendatahub.io/
Event Overview
This OpenShift Commons Gathering on AI and Machine Learning is co-located with NVIDIA's GTC virtual event on October 5–9, 2020!
The OpenShift Commons Gatherings bring together experts from all over the world to discuss container technologies, best practices for cloud native application developers and the open source software projects that underpin the OpenShift ecosystem. This event will gather developers, data scientists, devops professionals and sysadmins together to explore the next steps in making container technologies successful and secure for your ML and AI workloads.
Where
Virtual Gathering at NVIDIA's GTC
When
Friday, October 9, 2020
Price
Included in your GTC Registration
Please note: Pre-registration is required. Your GTC registration grants you access to the virtual on-demand OpenShift Commons Gathering talks on October 5-9 from 9 a.m. - 1 p.m. ET. This forum will feature a discussion of best practices, lessons learned, and the open source projects that support OpenShift and Kubernetes—all from project leads with production enterprise deployments.
Schedule
Code of Conduct: We follow the Code of Conduct of other events such as KubeCon + CloudNativeCon. Similarly we are dedicated to providing a harassment-free experience for participants at all of our events, whether they are held in person or virtually. All event participants, whether they are attending an in-person event or a virtual event, are expected to behave in accordance with professional standards, with both this Code of Conduct as well as their respective employer's policies governing appropriate workplace behavior and applicable laws.
COVID-19 Health + Safety Information: We are committed to our attendees' health and safety and follow the health and safety policies of the events we are co-located with, or by default those of the CNCF.
See sessions from previous gatherings
- 9:00 AM: The Enterprise Neurosystem Initiative, William Wright (Red Hat)
- 9:30 AM: AIOps vs MLOps vs DevOps, Zak Berrie (Red Hat)
- 10:00 AM: GPU-Accelerated Machine Learning with OpenShift Container Platform, Diane Feddema (Red Hat), Michael Bennett (Dell)
- 10:30 AM: Using MPI operator to run GPU-accelerated scientific workloads on Red Hat OpenShift with Lustre FS, David Gray (Red Hat)
- 11:00 AM: Using GPUs for Data Science & Optimization Containers in OpenShift, Cory Latschkowski (ExxonMobil)
- 11:30 AM: Applying AIOps to Kubernetes Telemetry Data with Open Data Hub at OpenShift, Alex Corvin (Red Hat), Ivan Necas (Red Hat)
- 12:00 PM: Accelerating AI on the Edge, Nick Barcet (Red Hat), Kevin Jones (NVIDIA)
- 12:31 PM: Data driven insights with SQL Server Big Data Clusters and OpenShift, Buck Woody (Microsoft)
Speakers
Venue
October 9, 2020 | 9:00 AM–8:00 PM EDT
Virtual at NVIDIA's GTC