Building Time Series Foundational Models: Past, Present and Future
Welcome. You are watching "Building Time Series Foundational Models: Past, Present and Future", uploaded by Data Phoenix Events on 2024-07-18T22:57:16-07:00. This video is shared for educational purposes.
Info About This Video
Name | Building Time Series Foundational Models: Past, Present and Future
Video Uploader | Data Phoenix Events
Upload Date | 19-07-2024 05:57:16
Video Description |
Ask your questions using Slido: https://app.sli.do/event/w5FGmT7Gj3yXqvMpqb1K5Q
Join the Data Phoenix Discord community: https://discord.gg/tVyu7CJ8nm
===========================
Time series data is ubiquitous across industries: the startup COO predicts customer demand; a clinician in the ICU reads medical charts; the stock broker forecasts security prices. In the past, a technical and domain expert would build, train, and deploy a new model for each task, in each industry's swim lane. This fragmentation of effort is a massive bottleneck!
Luckily, transformer architectures, which enable zero-shot sequence modeling across modalities, are a natural solution. We introduce a new frontier in transformer modalities, time series, where massive amounts of domain knowledge are taught to large time series models (LTSMs), forming a universal prior across forecasting, imputation, classification, and anomaly detection tasks.
Join us as we review the next frontier of AI, showcasing Gradient's LTSM, a novel architecture, and a massive time series dataset, achieving state-of-the-art performance on time series tasks. Our foundational model and datasets are fully open-sourced. Finally, we preview multimodal foundational time series models, where working with time series data is as easy as prompting ChatGPT.
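The announcement does not include implementation details of Gradient's LTSM. As a rough, hypothetical illustration of how a time series becomes a transformer "modality", the sketch below tokenizes a series into fixed-length patches and applies per-series normalization, in the style of published patch-based models such as PatchTST; the function names `patch_series` and `instance_normalize` are illustrative, not Gradient's API.

```python
import numpy as np

def patch_series(series, patch_len=4, stride=4):
    """Split a 1-D series into fixed-length patches that act as input tokens."""
    n_patches = (len(series) - patch_len) // stride + 1
    return np.stack([series[i * stride : i * stride + patch_len]
                     for i in range(n_patches)])

def instance_normalize(patches):
    """Normalize per series so one pretrained model handles any scale/units."""
    mu, sigma = patches.mean(), patches.std() + 1e-8
    return (patches - mu) / sigma, (mu, sigma)

# Toy example: a 16-step series becomes 4 patch-tokens of length 4.
series = np.arange(16, dtype=float)
tokens = patch_series(series)            # shape (4, 4)
normed, stats = instance_normalize(tokens)
```

After patching and normalization, the patches would be linearly embedded and fed to a transformer; the same preprocessing works whether the series is demand, vitals, or prices, which is what makes a single cross-domain model plausible.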
Key Highlights of the Webinar:
- Cross-Industry Time Series Analysis: Learn how time series data is used in diverse fields, from demand forecasting in retail, to monitoring ICU patients in healthcare, to predicting stock movements in finance.
- Introduction to Transformer Modalities in Time Series: Discover the application of transformer architectures to time series data and how these models can perform zero-shot learning across different types of time series tasks.
- State-of-the-Art Time Series Models: Get an in-depth look at Gradient's LTSM (Large Time Series Model), a novel architecture, and a massive time series dataset, achieving state-of-the-art performance on time series tasks.
- Preview of Multimodal Foundational Models: Preview upcoming advancements in multimodal foundational models that promise to simplify working with time series data, making it as easy as prompting ChatGPT.
Speaker
Leo is Chief Scientist at Gradient, a full-stack AI platform that enables businesses to build customized agents to power enterprise workloads, where he leads research and analytics. Prior to Gradient, Leo led CloudTruck's ML and data science orgs, pioneering applied ML for operational challenges. Before that, Leo held leadership roles across Opendoor, Optimizely, and Disney. Leo holds a bachelor's degree in economics as well as a master's and a PhD in statistics, all from Stanford.
Category | Science & Technology