IBM-OSTİMTECH Workshop

About

Many CEOs, COOs, CTOs, and other decision-makers see an advantage in the rise of big data and faster computing power. The question arises: how can all this data drive innovation?  Harnessing the power of data through data science can help accelerate financial, sales, manufacturing, and supply-chain operations; enable a better, more intimate customer experience; or reduce downtime.
Enterprises face challenges in accurately forecasting demand and shipping costs across their supply chains, and inaccurate forecasts can lead to widespread disruptions. By leveraging the latest developments in both time series forecasting and decision optimization, enterprises can better harness their own data and use scalable methods to generate forecasts and plan logistics for a large portfolio of products more quickly and effectively. Predictive and preventive models can be applied to mitigate downtime, offset losses, and enable safer working environments.
AI and machine learning can also help bring better products to market faster. For example, when launching a new product or service, it can be imperative to use data analytics to gain insight into the market, demand, and target demographics. In this two-day workshop, attendees will use a cloud-based analytics platform to complete several hands-on labs. Activities include accessing and curating data, using automated AI modeling tools, working with supply-chain decision optimization, and exploring more advanced modeling capabilities.
Lab 1: Catalogs & Projects
You will play the roles of two members of the advanced analytics team: the Data Steward and the Data Engineer.  As the Data Steward, your task is to create secure, permissioned access to sensitive data repositories.  As the Data Engineer, you will then begin gathering data for analysis.  At the conclusion of the lab, your project will be primed for the next phase: building pipelines and cleaning data.
Objectives
Create and populate a Knowledge Catalog with connections and assets.
Create and configure a Project on Cloud Pak for Data.
Explore the relationship between Project and Catalog spaces.
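The lab performs these steps through the Cloud Pak for Data interface. For orientation only, the sketch below shows roughly how the same catalog and project could be created programmatically. It assumes the Watson Data API's catalog and project endpoints and a bearer token obtained from the platform; the host name, token, and payload details are placeholders and may differ across releases.

# Hypothetical sketch: creating a catalog and a project programmatically.
# Endpoint paths follow the public Watson Data API but may vary by version.
import requests

CPD_HOST = "https://cpd.example.com"   # placeholder host
TOKEN = "<bearer-token>"               # token obtained from the platform
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# Create a governed catalog for the Data Steward role.
catalog = requests.post(
    f"{CPD_HOST}/v2/catalogs",
    headers=HEADERS,
    json={"name": "Workshop Catalog", "generator": "workshop", "is_governed": True},
).json()

# Create an analytics project for the Data Engineer role.
project = requests.post(
    f"{CPD_HOST}/transactional/v2/projects",
    headers=HEADERS,
    json={"name": "Workshop Project", "generator": "workshop",
          "storage": {"type": "assetfiles"}},
).json()

print(catalog.get("metadata", {}).get("guid"), project.get("location"))

The identifiers returned here are what later API calls would reference when publishing assets back to the catalog.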
Lab 2: Data Engineering – Data Refinery
In the previous lab you set up a catalog to index the assets and a project to begin working with them.  In this lab you will continue in the role of the Data Engineer, building data pipelines and creating clean data sets for analysis.
Objectives
Prepare a stream output for advanced analytics using Data Refinery.
Publish the prepared data to the catalog.
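Data Refinery records shaping steps interactively rather than in code, but the operations themselves are ordinary data-cleaning transformations. As a rough illustration only, with invented file and column names, the equivalent shaping might look like this in pandas:

# Illustrative only: the kind of shaping Data Refinery performs, expressed in
# pandas. The file name and column names are invented for the example.
import pandas as pd

raw = pd.read_csv("sensor_readings.csv")         # hypothetical source asset

cleaned = (
    raw.drop_duplicates()                         # remove repeated readings
       .dropna(subset=["temperature", "rpm"])     # drop rows missing key fields
       .astype({"temperature": "float64", "rpm": "float64"})
       .query("temperature > -40 and temperature < 150")  # filter sensor glitches
)

# Persist the refined output so it can be published to the catalog.
cleaned.to_csv("sensor_readings_refined.csv", index=False)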
Lab 3: Data Science and Machine Learning
Having established the pipeline and prepared a cleansed data set as the Data Engineer, it is time to switch to the role of the Data Scientist and begin analyzing the data.  You will use the insights from this analysis to create a data set that can be used to train predictive models.  After training the model, you will deploy it using the Watson Machine Learning service and configure performance monitoring to ensure that model accuracy does not decay over time.
Objectives
Analyze the captured sensor data using open source tools and programming.
Create a data asset that can be used to train predictive models.
Train, deploy, and monitor the performance of a predictive model.
Publish new assets to the catalog.
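The lab drives training, deployment, and scoring through the platform tooling. As a hedged illustration of the same flow in code, the sketch below trains a scikit-learn model on stand-in data and deploys it with the ibm_watson_machine_learning Python client. The credentials, deployment space ID, model type string, and software specification name are placeholders that depend on the installation and runtime version.

# Sketch of the train/deploy/score flow; all credentials and IDs are placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from ibm_watson_machine_learning import APIClient

# Stand-in training data; in the lab this would be the refined sensor data.
X_train, y_train = make_classification(n_samples=200, n_features=2,
                                       n_informative=2, n_redundant=0,
                                       random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

client = APIClient({"url": "https://cpd.example.com", "token": "<bearer-token>"})
client.set.default_space("<deployment-space-id>")

stored = client.repository.store_model(
    model=model,
    meta_props={
        client.repository.ModelMetaNames.NAME: "failure-predictor",
        client.repository.ModelMetaNames.TYPE: "scikit-learn_1.1",  # depends on runtime
        client.repository.ModelMetaNames.SOFTWARE_SPEC_UID:
            client.software_specifications.get_id_by_name("runtime-22.2-py3.10"),
    },
)

deployment = client.deployments.create(
    client.repository.get_model_id(stored),
    meta_props={
        client.deployments.ConfigurationMetaNames.NAME: "failure-predictor-online",
        client.deployments.ConfigurationMetaNames.ONLINE: {},
    },
)

# Score one record against the online endpoint (feature values are invented).
payload = {"input_data": [{"values": [[0.4, -1.2]]}]}
print(client.deployments.score(client.deployments.get_id(deployment), payload))

In the lab itself, the refined data from Lab 2 replaces the synthetic training set, and performance monitoring is configured on top of the deployed endpoint.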
Lab 4: Build a Supply Chain Optimization Application
This lab brings together the analytical roles covered in the previous labs: you will implement the entire pipeline of roles and tasks needed to deliver a real-world decision optimization solution. The final product is a simple descriptive dashboard for the client, in this case a warehouse owner. From data administrator to UI developer, you will oversee the intake of data from multiple sources, wrangle the data for model preparation, run an optimization model notebook, and plug the optimization solution into a dashboard user interface.
Objectives 
Import data from a governed data catalog.
Create a prescriptive project to build a decision optimization application.
Run a CPLEX optimization model.
Create and run an R Shiny application.
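To give a feel for what the optimization notebook contains, here is a minimal transportation-style model written with docplex, the Python API for CPLEX. The warehouses, customers, capacities, demands, and costs are invented for illustration; the lab's actual model and data come from the governed catalog.

# Minimal, invented supply-chain example in docplex (the CPLEX Python API):
# ship goods from warehouses to customers at minimum cost.
from docplex.mp.model import Model

warehouses = {"W1": 120, "W2": 80}                  # capacity per warehouse
customers = {"C1": 70, "C2": 60, "C3": 50}          # demand per customer
cost = {("W1", "C1"): 4, ("W1", "C2"): 6, ("W1", "C3"): 9,
        ("W2", "C1"): 5, ("W2", "C2"): 3, ("W2", "C3"): 7}  # unit shipping cost

m = Model(name="transport")
ship = {(w, c): m.continuous_var(lb=0, name=f"ship_{w}_{c}")
        for w in warehouses for c in customers}

# Each warehouse ships no more than its capacity.
for w, cap in warehouses.items():
    m.add_constraint(m.sum(ship[w, c] for c in customers) <= cap)

# Each customer's demand is met exactly.
for c, demand in customers.items():
    m.add_constraint(m.sum(ship[w, c] for w in warehouses) == demand)

m.minimize(m.sum(cost[w, c] * ship[w, c] for w in warehouses for c in customers))

solution = m.solve()
if solution:
    for (w, c), var in ship.items():
        if var.solution_value > 1e-6:
            print(f"{w} -> {c}: {var.solution_value:.0f} units")
    print("total cost:", m.objective_value)

A dashboard such as the R Shiny application would read the shipment plan produced by the solve and present it to the warehouse owner.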