Tabsdata 1.6.0 is out 🚀
The new release brings a revamped UI, plus:
• KafkaTriggers for real-time ETL
• CronTriggers for scheduled workflows
• Data Quality API to validate & quarantine bad data
Excited to see what this unlocks for you!
Try now with: `pip install tabsdata --upgrade`
We’re improving the Tabsdata UI every day. In our latest release, triggering a function now displays an execution graph that shows your workflow executing in real time. It’s super helpful for tracking how long each step takes and pinpointing exactly where errors occur.
🚀 New Blog! 🎯
Integrate Salesforce reports with Snowflake using Tabsdata.
tabsdata.com/blog/hands-o...
Meetup Alert 🚨
Are you a data engineer in the Bay Area? Join us at our Data Engineering Meetup. Come and engage with fellow practitioners, thought leaders, and enthusiasts to share insights and spark meaningful discussions.
Tue, Dec 9 @ 6pm, 870 Market St, SF
Details: luma.com/vrhtcyhj
We successfully wrapped up Data Engineering Meetup 2.0 in the Bay Area. Kudos to @shubham-gupta.bsky.social for putting it together.
Check out the link for more details:
www.linkedin.com/feed/update/...
🚀 Our Fourth Tutorial is Live! 🎯
What we cover:
🔹 Registering a Tabsdata publisher to read, filter, and publish data from Google Sheets.
🔹 Registering a subscriber to export the data to Neon PostgreSQL.
🔹 Auto-updating the exported data when new data is published.
🚀 Our Third Tutorial is Live! 🎯
What we cover:
🔹 Registering a Tabsdata publisher to read, filter, and publish a local CSV file.
🔹 Registering a subscriber to export the data to AWS as an Iceberg table.
🔹 Auto-updating the exported data when new data is published.
github.com/tabsdata/tut...
🚀 Our Second Tutorial is Live! 🎯
What we cover:
🔹 Registering a Tabsdata publisher to read, filter, and publish from PostgreSQL.
🔹 Registering a subscriber to filter and export the data to PostgreSQL.
🔹 Auto-updating the exported data when new data is published.
github.com/tabsdata/tut...
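The tutorial itself uses Tabsdata publisher and subscriber functions against PostgreSQL; as a rough standard-library-only sketch of the data flow those steps describe (sqlite3 standing in for PostgreSQL, with made-up table names and data, not the Tabsdata API):

```python
import sqlite3

# Source database the "publisher" reads from (sqlite3 stands in for PostgreSQL).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 50.0), (2, 150.0), (3, 300.0)])

def publish(conn: sqlite3.Connection) -> list[tuple]:
    """Publisher step: read and filter (keep orders over 100)."""
    return conn.execute("SELECT id, amount FROM orders WHERE amount > 100").fetchall()

# Destination database the "subscriber" exports to.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE big_orders (id INTEGER, amount REAL)")

def subscribe(rows: list[tuple]) -> None:
    """Subscriber step: export the published table to the destination."""
    dst.executemany("INSERT INTO big_orders VALUES (?, ?)", rows)
    dst.commit()

subscribe(publish(src))
print(dst.execute("SELECT COUNT(*) FROM big_orders").fetchone()[0])  # 2 rows exported
```

In Tabsdata, the auto-update step means the subscriber re-runs whenever the publisher commits new data; here you would simply call `subscribe(publish(src))` again.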
Read more to understand how Pub/Sub for Tables enables the creation of data products that last.
tabsdata.com/blog/part-3-...
#Tabsdata #PubSubforTables #DataIntegration #DataPipelines #DataStrategy #DataProducts
I recently had the opportunity to attend a large data & analytics conference where I learned about a new company called @tabsdata.com and their novel approach to data integration through a capability they’re calling Pub/Sub for Tables. I wrote about it on LinkedIn.
www.linkedin.com/pulse/pubsub...
"What exactly is Pub/Sub for Tables?"
We wrote an article to answer exactly that. 👇
tabsdata.com/blog/what-is...
#Tabsdata #PubSubforTables #DataIntegration #DataPipelines #DataStrategy #DataProducts
There’s a better way to integrate data — and it starts with tables, not jobs.
Here’s how Tabsdata replaces fragile pipelines and gives data teams clarity, autonomy, and time back. 👇
tabsdata.com/blog/part-1-...
#Tabsdata #PubSubforTables #DataIntegration #DataPipelines #DataStrategy #DataProducts
What if… pipelines are the reason your data strategy is stuck?
Here’s how Pub/Sub for Tables flips the model and fixes the pipeline mess 👇
🔗 tabsdata.com/blog/how-pub...
#Tabsdata #PubSubforTables #DataIntegration #DataPipelines #DataStrategy #DataContracts #DataProducts
🚀 Our First Tutorial is Live! 🎯
What we cover:
🔹 Registering a Tabsdata publisher to read, filter, and publish CSV data.
🔹 Registering a subscriber to export the data as JSON.
🔹 Auto-updating the exported file when new data is published.
Try it out below! 🚀
github.com/tabsdata/tut...
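The tutorial walks through this with Tabsdata publisher and subscriber functions; as a rough standard-library-only illustration of the CSV → filter → JSON flow those bullets describe (sample data and column names are made up, not from the tutorial):

```python
import csv
import io
import json

# Sample input standing in for the tutorial's CSV file.
CSV_DATA = """name,department
Alice,engineering
Bob,sales
Carol,engineering
"""

def publish(raw_csv: str) -> list[dict]:
    """Publisher step: read the CSV and keep only engineering rows (the filter)."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    return [r for r in rows if r["department"] == "engineering"]

def subscribe(table: list[dict]) -> str:
    """Subscriber step: export the published table as JSON."""
    return json.dumps(table, indent=2)

table = publish(CSV_DATA)
print(subscribe(table))
```

In Tabsdata, the exported file is refreshed automatically when new data is published; this sketch only shows the shape of the transformation, not that trigger mechanism.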
Exciting News! Tabsdata is heading to the Gartner Data & Analytics Summit, March 3–5 at Gaylord Palms Resort, Kissimmee, Florida! Visit us at Booth #108.
Comment below if you are around. We would love to catch up!
#GartnerDA #Tabsdata #DataIntegration #DataEngineering #DataManagement
Introducing Tabsdata: Pub/Sub for Tables
We’re officially out of stealth mode! 🚀 Tabsdata is now available for everyone to explore.
Check out our full announcement on LinkedIn: www.linkedin.com/posts/arvind...
#Tabsdata #DataIntegration #DataEngineering #DataManagement #DataPipelines #ETL