Vinoo Ganesh

Speaker, Technologist, and Startup Advisor

Accelerating Data Evaluation

Data + AI Summit 2021

As the data-as-a-service ecosystem continues to evolve, data brokers are faced with an unprecedented challenge – demonstrating the value of their data. Successfully crafting and selling a compelling data product relies on a broker’s ability to differentiate their product from the rest of the market. For smaller or static datasets, measures like row count and cardinality can speak volumes. However, when datasets reach the terabyte or petabyte scale, differentiation becomes much more difficult.

Guaranteeing pipeline SLAs and data quality standards with Databand

Airflow Summit 2021

We’ve all heard the phrase “data is the new oil.” But imagine a world where this analogy holds true – where problems in the flow of data, such as delays, low quality, and high volatility, could bring down whole economies. When data really is the new oil, with people and businesses similarly reliant on it, how do you avoid the fires, spills, and crises? As data products become central to companies’ bottom lines, data engineering teams need to set higher standards for the availability, completeness, and fidelity of their data.

Strata Data Superstream Series: Creating Data-Intensive Applications

O'Reilly Strata

As the scale of data continues to grow (alongside an ever-expanding ecosystem of tools to work with it), developing successful applications is an increasingly challenging proposition—and a necessity. At each stage of the process, from architecting to processing and storing data to deployment, there is a range of aspects to consider: scalability, consistency, reliability, efficiency, and maintainability. It can be hard to figure out the right way forward.

Large Scale Data Analytics with Vinoo Ganesh

Data Standard

In this episode of The Data Standard, Catherine Tao and Vinoo Ganesh talk about large-scale data and data processing challenges. Vinoo starts the conversation by explaining his current role and how his company uses data to find working solutions for a wide range of problems. He then discusses OLTP and OLAP models and how large-scale data can help improve workflows and deliver better results. Optimization is needed for every specific application, and Vinoo describes the methods he uses to enhance existing platforms.

The Apache Spark File Format Ecosystem

Spark Summit 2020

In a world where compute is paramount, it is all too easy to overlook the importance of storage and I/O in the performance and optimization of Spark jobs. In reality, the choice of file format has drastic implications for everything from the ongoing stability to the compute cost of jobs. These file formats also employ a number of optimization techniques to minimize data exchange, permit predicate pushdown, and prune unnecessary partitions.