
Releases: mage-ai/mage-ai

0.9.38 | Goosebumps 😱 🎃

25 Oct 00:24

What's Changed

🎉 Exciting New Features

🧑‍💻 Side-by-side block view

Your Mage development workflow just got a whole lot more efficient. Starting today, you can view blocks side-by-side for twice the editing power! Check this one out to improve your DevEx and make more data magic! Simply click the "side-by-side" icon in the center of the editor to get started!


by @tommydangerous in #3804

🧱 Support for dbt-dremio


Dremio users, rejoice! You can now execute dbt models in your lakehouse thanks to support for the dbt-dremio package.

by @dy46 in #3760

👨🏻‍💼 Support GitHub Enterprise authentication


An often-requested feature: Mage now supports GitHub Enterprise authentication! 💥

by @dy46 in #3817

🤖 Support auto termination in workspace

Mage now supports auto-termination checks, which will run once every sixty seconds. This can be used to auto start/stop k8s workspaces. Configure it when creating your workspace to get started!


by @dy46 in #3721 and #3751

⏰ Show server time in app

This release adds a current time display in the top right of the app header. By default, the time is shown in UTC; click the display to open a dropdown menu.



Nice! Frontend UX improvements coming in clutch!

by @anniexcheng in #3785

🐛 Bug Fixes

💅 Enhancements & Polish

😎 New Contributors

Full Changelog: 0.9.35...0.9.38

0.9.35 | 🤹‍♂️ Loki

17 Oct 22:09
63ac8e4

What's Changed

🎉 Exciting New Features

🙇‍♂️ The Great Pipeline Unification

Perhaps not as momentous as The Second Great Unification, this unification is much more useful for data pipelines! Data Integration sources & destinations can now be added as blocks in batch pipelines! 🤯

What does this mean? Using Mage, you can now perform integration (extract), transformation, and loading in the SAME pipeline using Singer sources and your favorite tools (dbt, Python, SQL)! Read more in our docs here.

This is like having Fivetran/Airbyte, dbt, and a Jupyter notebook all in one, WITH engineering best practices built in!

👩🏻‍💻 Interactions - a no-code UI for configuring data pipelines

Another huge update: Mage now lets you build templates to unlock data at scale. You can configure data pipelines that are fully customizable for stakeholders and consumers. Read more about interactions here and get started building today!

This functionality will go a long way for democratizing data pipelines and easing data workloads!

🤓 Granular API Permissions

Mage now supports granular API permissions on ANY action. Each permission can grant read and write operations on specific resources (e.g. API endpoints). One or more roles can be assigned to one or more users.

What does this mean? You can create permissions for your team at the most granular level possible. Mage is now completely governable for ANY action. Read more here.

🥳 NEW MongoDB CDC Streaming Source

Mage now supports MongoDB CDC Streaming Sources! A big thanks to @emincanoguz11 for the contribution!

by @emincanoguz11 in #3716

🐛 Bug Fixes

💅 Enhancements & Polish

New Contributors

Full Changelog: 0.9.34...0.9.35

0.9.34 | C-3PO

11 Oct 21:50
ed9a1a4

What's Changed

🎉 Exciting New Features

🧠 Add support for dbt-synapse

This is going to "dbt" amazing! Mage now supports the dbt-synapse library, allowing dbt to be executed against the Azure Synapse Data Warehouse. You can read more on the package here. Excited to see our Azure users make use of this one!

@dy46 in #3657

☁️ Added Google Cloud Pubsub as a sink for streaming pipeline

A big shout out to @pammusankolli, who recently added Google Cloud Pubsub as a sink for data streaming pipelines! If you're not familiar with PubSub, you can read more here. Be sure to check out the docs in Mage to build your next pipeline!

@pammusankolli in #3689

🤓 Support HTML tags in Markdown blocks

All of our Markdown enthusiasts will appreciate this one! Previously, Mage's markdown blocks only supported images via this format:

![](https://images.photowall.com/products/57215/golden-retriever-puppy.jpg?h=699&q=85)

Now, Mage supports <img> elements with custom sizes in Markdown blocks by providing width and height attributes like so:

<img src="https://images.photowall.com/products/57215/golden-retriever-puppy.jpg?h=699&q=85" alt="drawing" width="200"/>

Nice!

@anniexcheng in #3692

🔍 Added Opensearch Destination

Mage now supports OpenSearch as a destination in data integration pipelines! 🥳

@Luishfs in #3719

🐛 Bug Fixes

💅 Enhancements & Polish

New Contributors

Full Changelog: 0.9.30...0.9.34

0.9.30 | Cowboy Bebop 🤠🚀

03 Oct 23:57
d7de886

What's Changed

🎉 Exciting New Features

🌊 Streaming: Base Class Overhaul + 8 New Destinations

This. is. huge. With a complete base class re-write, every IO destination is now a streaming destination.

That means you can stream to:

And any future destinations added as an IO base. A huge shoutout to @wangxiaoyou1993 for the herculean effort!

by @wangxiaoyou1993 in #3623

👀 Recently viewed pipelines

Some frontend polish now allows you to see your Recently Viewed pipelines from the overview page— a nice touch!

by @tommydangerous in #3633

👨‍👩‍👧‍👦 Community: GCS Sensors

@pammusankolli delivers his first contribution by adding a Google Cloud Storage sensor that checks whether a file exists in a given bucket! Thanks for the add; this will be super useful to our Google platform users!

by @pammusankolli in #3651

🧩 Support syncing Git submodules

A solid improvement to our Git Sync functionality, you can now sync submodules, too! Just check the Git Sync settings to enable the feature.

by @dy46 in #3593

🔂 Add always on schedule interval

On user request, we've added an @always_on interval for scheduled triggers. An always-on schedule starts a new pipeline run as soon as the latest run completes.

Once a pipeline run ends, regardless of whether it failed or succeeded, a new run begins. Let us know if you find this valuable!
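
If you define triggers in code, here's a minimal sketch of how this could look in a pipeline's triggers.yaml (assuming the documented time-trigger format; the trigger name and start time below are placeholders):

triggers:
- name: always_on_example          # hypothetical trigger name
  schedule_type: time
  schedule_interval: '@always_on'  # the new interval added in this release
  start_time: 2023-10-01           # placeholder start time
  status: active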

by @dy46 in #3611

🐛 Bug Fixes

💅 Enhancements & Polish

New Contributors

Full Changelog: 0.9.28...0.9.30

0.9.28 | The Creator 🤖

27 Sep 00:45
7022f4b

What's Changed

🎉 Exciting New Features

🫨 Brand new dbt blocks!

One of our top contributors @christopherscholz just delivered a huge feature! A completely streamlined dbt Block!

Here are some of the highlights:

  • Integrates directly with dbt-core instead of calling it via a subprocess, which makes all of dbt's functionality available
  • Use dbt to seed output dataframes from upstream blocks
  • Use dbt to generate correct relations, e.g. default schema names, which differ between databases
  • Use dbt to preview models by backporting the dbt seed command to dbt-core==1.4.7
  • No Mage-based database connections are used to handle the block
    • This allows installing any dbt adapter that supports the dbt-core version
  • All code moved into a single interface called DBTBlock
    • Doubles as a factory for the child blocks DBTBlockSQL and DBTBlockYAML
    • The child blocks make it easier to understand which block does what

There's lots to unpack in this one, so be sure to read more in the PR below and check out our updated docs.

by @christopherscholz in #3497

➕ Add GCS storage to store block output variables

Google Cloud users rejoice! Mage already supports storing block output variables in S3, but thanks to contributor @luizarvo, you can now do the same in GCS!

Check out the PR for more details and read up on the implementation here.

by @luizarvo in #3597

✨ Tableau Data Integration Source

Another community-led integration! Thank you @mohamad-balouza for adding a Tableau source for data integration pipelines!

by @mohamad-balouza in #3581

🦆 Add DuckDB loader and exporter templates

Last week, we rolled out a ton of new DuckDB functionality; this week, we're adding DuckDB loader and exporter templates! Be sure to check them out when building your new DuckDB pipelines! 😄


by @matrixstone in #3553

🧱 Bulk retry incomplete block runs

Exciting frontend improvements are coming your way! You can now retry all of a pipeline's incomplete block runs from the UI. This includes all block runs that do not have a completed status.

🐛 Bug Fixes

💅 Enhancements & Polish

New Contributors

Full Changelog: 0.9.26...0.9.28

0.9.26 | Expend4bles 💥

20 Sep 00:42
0195532

What's Changed

🎉 Exciting New Features

🐣 DuckDB IO Class and SQL Block

Folks, we've got some ducking magic going on in this release. You can now use DuckDB files inside Mage's SQL Blocks. 🥳 🦆 🪄

You can use data loaders to CREATE and SELECT from DuckDB tables as well as write new data to DuckDB.
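
If you prefer Python blocks, here's a rough sketch of loading from DuckDB with Mage's IO class. This assumes the DuckDB class follows the same with_config/load pattern as Mage's other IO classes and that your DuckDB settings live in io_config.yaml; the table name is a placeholder:

from os import path

from mage_ai.data_preparation.repo_manager import get_repo_path
from mage_ai.io.config import ConfigFileLoader
from mage_ai.io.duckdb import DuckDB

if 'data_loader' not in globals():
    from mage_ai.data_preparation.decorators import data_loader


@data_loader
def load_from_duckdb(*args, **kwargs):
    # Read connection settings (e.g. the DuckDB file path) from io_config.yaml.
    config_path = path.join(get_repo_path(), 'io_config.yaml')
    loader = DuckDB.with_config(ConfigFileLoader(config_path, 'default'))

    # 'my_table' is a placeholder table name.
    return loader.load('SELECT * FROM my_table')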

Check out our docs to get started today!

by @matrixstone in #3463

📊 Charts 2.0

This is another huge feature— a complete overhaul of our Charts functionality!

There are 2 new charts dashboards: a dashboard for all your pipelines and a dashboard for each pipeline.

You can add charts of various types with different sources of data and use these dashboards for observability or for analytics.

There's a ton to unpack here, so be sure to read more in our docs.

by @tommydangerous

⏰ Local timezone setting

This one is a big quality of life improvement: Mage can now display datetimes in local timezones... No more UTC conversion! Just navigate to Settings > Workspace > Preferences to enable a new timezone!

by @johnson-mage in #3481

🔁 Influxdb data loader

A big shoutout to @mazieb, who added an InfluxDB destination just last week! You can now load data from InfluxDB via Mage. Thanks for your hard work!

Read more in our docs here.

by @mazieb in #3430

🎚️ Support for custom logging levels

Another frequently requested feature shipping this week, courtesy of @dy46: custom block-level logging!

You can now specify logging at the block-level by directly changing the logger settings:

if 'data_loader' not in globals():
    from mage_ai.data_preparation.decorators import data_loader


@data_loader
def load_data(*args, **kwargs):
    # The block-level logger is passed in through kwargs.
    kwarg_logger = kwargs.get('logger')

    kwarg_logger.info('Test logger info')
    kwarg_logger.warning('Test logger warning')
    kwarg_logger.error('Test logger error')

    ...

See more in our docs here.

by @dy46 in #3473

🐛 Bug Fixes

💅 Enhancements & Polish

New Contributors

Full Changelog: 0.9.23...0.9.26

0.9.23 | One Piece 🏴‍☠️

19 Sep 17:19
a5d98fe

What's Changed

🎉 Exciting New Features

✨ Add & update block variables through UI

📰 Hot off the press, you can now add and update block variables via the Mage UI!


Check out our docs to learn more about block variables.

by @johnson-mage in #3451

✨ PowerBI source [Data Integration]

You can now pull PowerBI data in to all your Mage projects!

A big shoutout to @mohamad-balouza for this awesome contribution. 🎉

Read more about the connection here.

by @mohamad-balouza in #3433

✨ Knowi source [Data Integration]

@mohamad-balouza was hard at work! You can also integrate Knowi data in your Mage projects! 🤯

Read more about the connection here.

by @mohamad-balouza in #3446

✨ InfluxDB Destination [Streaming]

A big shoutout to @mazieb! You can now stream data to InfluxDB via Mage. Thanks for your hard work!

Read more in our docs here.

by @mazieb in #3378

🐛 Bug Fixes

💅 Enhancements & Polish

New Contributors

Full Changelog: 0.9.21...0.9.23

0.9.21 | Ahsoka 🥷🏻

06 Sep 01:09
65d65a9

What's Changed

🎉 Exciting New Features

✨ Single-task ECS pipeline executor 🤖

Mage now supports running the whole pipeline in a single AWS ECS task instead of running each block in a separate ECS task! This speeds up pipeline execution by saving ECS task startup time.

Here's an example pipeline metadata.yaml:

blocks:
- ...
- ...
executor_type: ecs
run_pipeline_in_one_process: true
name: example_pipeline
...

The ECS executor_config can also be configured at the pipeline level.
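
As a hedged sketch, a pipeline-level executor_config might sit alongside the settings above like this (the keys and values below are assumptions for illustration only; check the ECS executor docs for the exact options your setup supports):

executor_type: ecs
executor_config:
  cpu: 1024     # placeholder task size
  memory: 2048  # placeholder task size
run_pipeline_in_one_process: true
name: example_pipeline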

by @wangxiaoyou1993 in #3418

✨ Postgres streaming destination 🐘

Postgres enthusiasts rejoice! You can now stream data directly to Postgres via streaming pipelines! 😳

Check out the docs for more information on this handy new destination.
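
A rough sketch of the destination block's YAML config in a streaming pipeline (assuming the connector_type convention used by Mage's other streaming sinks; all connection values below are placeholders):

connector_type: postgres
database: mydb                    # placeholder database
host: postgres.example.com        # placeholder host
port: 5432
username: admin                   # placeholder credentials; prefer secrets or env vars
password: my_password
schema: public
table: events                     # placeholder target table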

by @wangxiaoyou1993 in #3423

✨ Added sorting to the Block Runs table

You can now sort the Block Runs table by clicking on the column headers! Those of us who are passionate about having our ducks in a row are happy about this one! 🦆

by @johnson-mage in #3356

✨ Enable deleting individual runs from pipeline runs table

Bothered by that one run you'd rather forget? Individual runs can be dropped from the pipeline runs table, so you don't have to worry about them anymore!

by @johnson-mage in #3370

✨ Add timeout for block runs and pipeline runs

Much like Buzz Lightyear, we're headed "to infinity and beyond," but we get that your pipelines shouldn't be. This feature allows you to configure timeouts for both blocks and pipelines— if a run exceeds the timeout, it will be marked as failed.
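
A block-level timeout can also live in the pipeline's metadata.yaml; here's a minimal sketch, assuming a timeout key in seconds on the block (the key placement and value are assumptions for illustration):

blocks:
- uuid: load_data      # placeholder block
  timeout: 3600        # assumed: seconds before the block run is marked as failed
  ...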


by @dy46 in #3399

🐛 Bug Fixes

  • Fix nextjs local build type error by @johnson-mage in #3389
  • Fixed NULL headers breaking API Source by @Luishfs in #3386
  • Fixes on MongoDB destination (Data integration) by @Luishfs in #3388
  • Fix Git commit from version control by @dy46 in #3397
  • Change the default search path to the schema given in the connection URL for PostgreSQL by @csharplus in #3406
  • Raise exception if a pipeline has duplicate block uuids by @dy46 in #3385
  • Fix creating subfolder when renaming block by @wangxiaoyou1993 in #3419
  • Docker: Installation Method of Node using setup_17.x is no longer supported by @christopherscholz in #3405
  • [dy] Add support for key properties for postgresql destination by @dy46 in #3422

💅 Enhancements & Polish

New Contributors

Full Changelog: 0.9.19...0.9.21

0.9.19 | The Equalizer ⌚

30 Aug 01:08
bde10d1

What's Changed

🎉 Exciting New Features

✨ New AI Functionality

As a part of this release, we have some exciting new AI functionality: you can now generate pipelines and add inline comments using AI. 🤯

Generate entire pipelines using AI

Add inline comments using AI

See the following links for documentation of the new functionality.

  • Inline comment guide here
  • Generative pipeline guide here

by @tommydangerous in #3365 and #3359

✨ Elasticsearch as sink for streaming

Elasticsearch is now available as a streaming sink. 🥳🎉

A big thanks to @sujiplr for their contribution!

by @sujiplr in #3335

✨ Add GitHub source

GitHub is now available as a data integration source: you can pull in commits, changes, and more from the GitHub API!

by @mattppal in #3252

✨ Interpolate variables in dbt profiles

This is a big one for our dbt users out there! You can now use {{ variables('...') }} in dbt profiles.yml.

jaffle_shop:
  outputs:
    dev:
      dbname: postgres
      host: host.docker.internal
      port: 5432
      schema: {{ variables('dbt_schema') }}
  target: dev

That means pulling in custom Mage variables, directly!

by @tommydangerous in #3337

✨ Enable sorting on pipelines dashboard

Some great frontend improvements are going down! You can now sort pipelines on the dashboards, both with and without groups/filters enabled!

by @johnson-mage in #3327

✨ Display data integration source docs inline

Another awesome community contribution— this one also on the frontend. Thanks to @splatcollision, we now have inline documentation for our data integration sources!

Now, you can see exactly what you need, directly from the UI!

by @splatcollision in #3349

🐛 Bug Fixes

💅 Enhancements & Polish

✨ New Docs Structure

You might notice our docs have a new look! We've changed how we think about side-navs and tabs.

Our goal is to help you find what you need, faster. We hope you like it!

by @mattppal in #3324 and #3367

✨ Other Enhancements

New Contributors

Full Changelog: 0.9.16...0.9.19

0.9.16 | Gran Turismo 🏁

23 Aug 00:13
49481c5


What's Changed

🎉 Exciting New Features

✨ Global data products are now available in Mage! 🎉

A data product is any piece of data created by 1 or more blocks in a pipeline. For example, a block can create a data product that is an in-memory DataFrame, or a JSON serializable data structure, or a table in a database.

A global data product is a data product that can be referenced and used in any pipeline across the entire project. A global data product is entered into the global registry (global_data_products.yaml) under a unique ID (UUID) and it references an existing pipeline. Learn more here.
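
A minimal sketch of a registry entry, assuming the global_data_products.yaml format described in the docs (the UUID, pipeline name, and staleness window below are placeholders):

example_data_product:             # the global data product's unique ID
  object_type: pipeline
  object_uuid: example_pipeline   # the existing pipeline it references
  outdated_after:
    seconds: 3600                 # considered outdated this many seconds after its last run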

by @tommydangerous in #3206

✨ Add block templates for MSSQL 🤘

We now have some awesome block templates for our MSSQL users out there!

Check them out!


by @wangxiaoyou1993 in #3294

✨ Support sharing memory objects across blocks

In the metadata.yml of a standard batch pipeline, you can now configure running pipelines in a single process:

blocks:
...
run_pipeline_in_one_process: true
...

You may now also:

  • Define an object once and make it available in any block in the pipeline via the keyword argument kwargs['context'] (see the sketch below)
  • Pass variables between blocks in memory directly
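
A rough sketch of sharing an object through kwargs['context'], assuming run_pipeline_in_one_process: true is set and that context behaves as a shared in-memory dict across blocks (the key and object below are placeholders; the decorators come from the standard Mage block templates):

# Upstream block: stash an object in the shared context.
@data_loader
def load_model(*args, **kwargs):
    kwargs['context']['model'] = {'weights': [0.1, 0.2, 0.3]}  # placeholder object
    return {}


# Downstream block: read the same object back from memory.
@transformer
def use_model(data, *args, **kwargs):
    model = kwargs['context']['model']
    return model['weights']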

by @wangxiaoyou1993 in #3280

🐛 Bug Fixes

💅 Enhancements & Polish

New Contributors

Full Changelog: 0.9.14...0.9.16