Releases: Canner/WrenAI
0.7.4
Features
- Add gpt-4o-mini by @cyyeh in #541
- Implement the process for the evaluation framework by @paopa in #525
Fixes and Chores
- Bump requests from 2.31.0 to 2.32.2 in /wren-ai-service in the pip group across 1 directory by @dependabot in #530
- Some function refactor for evaluation by @cyyeh in #536
- Update kustomizations based on new version of wren-ai-service by @cyyeh in #514
Full Changelog: 0.7.3...0.7.4
0.7.3
Fixes and Chores
- Evaluation dataset curation app by @cyyeh in #398
- Pull ollama models if not exist automatically by @cyyeh in #512
- Optimize the PyTest and Image release CI by @paopa in #508
- Fix ibis-service related envs in the env.dev.example by @paopa in #517
- Bump setuptools from 69.5.1 to 70.0.0 in /wren-ai-service in the pip group across 1 directory by @dependabot in #519
Full Changelog: 0.7.2...0.7.3
0.7.2
Fixes and Chores
- Multiple table selection enabled after search. #396 by @himanshu634 in #491
- Fix unit tests and add some test cases for ibis adapter by @onlyjackfrost in #503
- Remove logs and instruction by @onlyjackfrost in #521 & #523
Maintenance
- Update ai-service README for easier contributing from communities by @cyyeh in #515
- Add wren engine as a submodule by @grieve54706 in #507
New Contributors
- @himanshu634 made their first contribution in #491
- @grieve54706 made their first contribution in #507
Full Changelog: 0.7.1...0.7.2
0.7.1
Fixes and Chores
- Refine wren-ai-service logging and setups by @cyyeh in #486
- Integrate Langfuse SDK to represent the evaluation result (#395) by @paopa in #495
- Fix trigger method for `force_deploy.py` by @cyyeh in #500
- Fix dry run endpoint when in `wren_ibis` mode by @cyyeh in #501
- Update python dependency by @cyyeh in #490
- Some minor updates for ai-service by @cyyeh in #493 and #497
Full Changelog: 0.7.0...0.7.1
0.7.1-rc.1
What's Changed
- update python dependency by @cyyeh in #490
- refine wren-ai-service logging and setups by @cyyeh in #486
- minor update for ai-service by @cyyeh in #493
- feature(wren-ai-service): integrate Langfuse SDK to represent the evaluation result (#395) by @paopa in #495
- Chore/ai service/minor update by @cyyeh in #497
Full Changelog: 0.7.0...0.7.1-rc.1
0.7.0
New Features
- Support using LLM and embedding models from different providers, e.g. an LLM served by Ollama together with OpenAI embedding models, by @cyyeh in #454 (see the configuration sketch after this list)
- Support ClickHouse data source by @onlyjackfrost in #481
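As a rough illustration of the mixed-provider setup above, a minimal sketch of the relevant environment variables might look like the following. The variable names are illustrative assumptions, not the exact keys; the authoritative list lives in the env.dev.example file of wren-ai-service.

```sh
# Hypothetical sketch of a mixed-provider .env (variable names are assumptions, not the real keys):
# serve the LLM with Ollama and use OpenAI for the embedding model.
LLM_PROVIDER=ollama
LLM_OLLAMA_URL=http://localhost:11434   # assumed local Ollama endpoint
EMBEDDER_PROVIDER=openai
OPENAI_API_KEY=sk-your-key-here         # placeholder value
```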
Fixes and Chores
- Fix issue #474 by @andreashimin in #479 & by @cyyeh in #480
- Remove pipeline "query understanding" from Wren AI Service by @cyyeh in #463
- Refactor schema change feature by @fredalai in #450
- Reflect the adjustment of Wren Engine v0.6.0 in Wren UI by @onlyjackfrost in #472
Maintenance
- Update bug report template by @cyyeh in #478 & #477
- Refine words in README.md by @chilijung in #473
- Add a comment in launcher by @onlyjackfrost in #476
- Fix typo and add embedding models providers info by @cyyeh in #462
- Update Next.js version and other library for security consideration by @andreashimin in #465, by @dependabot in #466, by @dependabot in #393, and by @dependabot in #354
Full Changelog: 0.6.0...0.7.0
0.7.0-rc.1
What's Changed
- fix typo and add embedding models providers info by @cyyeh in #462
- chore(deps): bump next from 13.5.6 to 14.1.1 in /wren-ui by @dependabot in #354
- chore(deps): bump the npm_and_yarn group across 1 directory with 2 updates by @dependabot in #393
- chore(deps): bump next from 14.1.1 to 14.2.4 in /wren-ui in the npm_and_yarn group across 1 directory by @dependabot in #466
- fix(wren-ui): Issues Encountered with Antd after Upgrading NextJS from 13 to 14 by @andreashimin in #465
- remove query understanding by @cyyeh in #463
- Update README.md by @chilijung in #473
- separate llm and embedder by @cyyeh in #454
- Chore: add comment by @onlyjackfrost in #476
- chore(wren-ui): Improve schema change by @fredalai in #450
- update instructions for container logs in issue template by @cyyeh in #477
- fix(wren-ui): Add quote to SQL in preview model data API by @andreashimin in #479
- add llm info by @cyyeh in #478
- add quotes for the fallback option when sql breakdown fails by @cyyeh in #480
- Chore: some update to reflect the adjustment of Wren Engine v0.6.0 by @onlyjackfrost in #472
- Feature: Support ClickHouse data source by @onlyjackfrost in #481
- Release 0.7.0-rc.1 by @onlyjackfrost in #483
Full Changelog: 0.6.0...0.7.0-rc.1
0.6.0
New Features
- Support custom LLMs through Ollama and OpenAI compatible APIs by @cyyeh in #376 and #457
- Support MS SQL Server by @onlyjackfrost in #443
- Enhance schema change feature and detect more affected semantic layer resources by @fredalai in #442
Fixes and Chores
- Implement Engine Adapter in ai service by @paopa in #440
- Fix some minor bugs in ai service by @cyyeh in #445, @cyyeh in #449 and @cyyeh in #451
- Improve the logging mechanism in ai service by @cyyeh in #447
- Support force deploy in wren ui by @onlyjackfrost in #452
- Overwrite docker compose file if using custom llm by @onlyjackfrost in #458
Maintenance and Documentation
- Change our official naming from `WrenAI` to `Wren AI` by @chilijung in #441
- Add log collecting instruction in bug report template by @onlyjackfrost in #438
- Add blog post link in README.md by @chilijung in #456
Notes and known issues:
- Wren AI now supports using customized LLMs. So that users don't have to delete the Qdrant container and volume themselves, we clear and reinitialize the vector database every time you start or restart the ai service. The reinitialization may occasionally fail, in which case you might have to run the "deploy" action again
- Wren AI now supports MS SQL Server as a data source, but there are some known issues related to Ibis and Sqlglot, so users may run into problems when creating calculated fields. See Canner/wren-engine#632
Full Changelog: 0.5.0...0.6.0
0.6.0-rc.2
What's Changed
- fix launcher by @cyyeh in #451
- fix: support force deploy by @onlyjackfrost in #452
- Update README.md by @chilijung in #456
- Chore: override compose file if using custom llm by @onlyjackfrost in #458
- Chore/ai service/auto deploy at init by @cyyeh in #457
- release 0.6.0-rc.2 by @onlyjackfrost in #460
Full Changelog: 0.6.0-rc.1...0.6.0-rc.2
0.6.0-rc.1
What's Changed
- Chore: add collecting log instruction in bug report template by @onlyjackfrost in #438
- Rename wren ai by @chilijung in #441
- feature(wren-ai-service): implement Engine Adapter to validate queries by @paopa in #440
- minor update for supporting 3rd party Open AI APIs + add Ollama by @cyyeh in #376
- Feature: Support MS SQL Server data source by @onlyjackfrost in #443
- fix wren-ai-service 0.6.0 issues by @cyyeh in #445
- feat(wren-ui): Support schema change phase 2 by @fredalai in #442
- change logger.error to logger.exception by @cyyeh in #447
- release 0.6.0-rc.1 by @onlyjackfrost in #448
- fix missing env by @cyyeh in #449
Notes
- Wren AI now supports using customized LLMs. So that users don't have to delete the Qdrant container and volume themselves, we clear the vector database the first time you migrate to this version. Users will have to run the "deploy" action to reinitialize the vector database (click the deploy button at the top right of the modeling page)
- Wren AI now supports MS SQL Server as a data source, but there are some known issues, and users may run into problems when creating calculated fields.
- Docker Desktop needs to be version 4.27.0 or later, since we rely on this syntax in docker-compose.yaml:

```yaml
env_file:
  - path: ./default.env
    required: true # default
  - path: ./override.env
    required: false
```
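If you want to verify that your local setup supports this, you can print the Compose version bundled with your Docker installation; Docker Desktop 4.27.0 ships a Compose v2.24.x release, which, to the best of our knowledge, is the line that introduced the path/required form of env_file.

```sh
# Print the Docker Compose version bundled with your Docker installation
docker compose version
```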
Full Changelog: 0.5.0...0.6.0-rc.1