diff --git a/.dockerignore b/.dockerignore new file mode 100644 index 0000000..2fb20a0 --- /dev/null +++ b/.dockerignore @@ -0,0 +1,5 @@ +* +!crates/ +!Cargo.toml +!Cargo.lock +.env \ No newline at end of file diff --git a/.github/CODE_OF_CONDUCT.md b/.github/CODE_OF_CONDUCT.md new file mode 100644 index 0000000..3eb39a8 --- /dev/null +++ b/.github/CODE_OF_CONDUCT.md @@ -0,0 +1,73 @@ +# Contributor Covenant Code of Conduct + +## Our Pledge + +In the interest of fostering an open and welcoming environment, we as +contributors and maintainers pledge to making participation in our project and +our community a harassment-free experience for everyone, regardless of age, body +size, disability, ethnicity, gender identity and expression, level of experience, +nationality, personal appearance, race, religion, or sexual identity and +orientation. + +## Our Standards + +Examples of behavior that contributes to creating a positive environment +include: + +* Using welcoming and inclusive language +* Being respectful of differing viewpoints and experiences +* Gracefully accepting constructive criticism +* Focusing on what is best for the community +* Showing empathy towards other community members + +Examples of unacceptable behavior by participants include: + +* The use of sexualized language or imagery and unwelcome sexual attention or + advances +* Trolling, insulting/derogatory comments, and personal or political attacks +* Public or private harassment +* Publishing others' private information, such as a physical or electronic + address, without explicit permission +* Other conduct which could reasonably be considered inappropriate in a + professional setting + +## Our Responsibilities + +Project maintainers are responsible for clarifying the standards of acceptable +behavior and are expected to take appropriate and fair corrective action in +response to any instances of unacceptable behavior. + +Project maintainers have the right and responsibility to remove, edit, or +reject comments, commits, code, wiki edits, issues, and other contributions +that are not aligned to this Code of Conduct, or to ban temporarily or +permanently any contributor for other behaviors that they deem inappropriate, +threatening, offensive, or harmful. + +## Scope + +This Code of Conduct applies both within project spaces and in public spaces +when an individual is representing the project or its community. Examples of +representing a project or community include using an official project e-mail +address, posting via an official social media account, or acting as an appointed +representative at an online or offline event. Representation of a project may be +further defined and clarified by project maintainers. + +## Enforcement + +Instances of abusive, harassing, or otherwise unacceptable behavior may be +reported by contacting us via oss@maif.fr. All +complaints will be reviewed and investigated and will result in a response that +is deemed necessary and appropriate to the circumstances. The project team is +obligated to maintain confidentiality with regard to the reporter of an incident. +Further details of specific enforcement policies may be posted separately. + +Project maintainers who do not follow or enforce the Code of Conduct in good +faith may face temporary or permanent repercussions as determined by other +members of the project's leadership. 
+ +## Attribution + +This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, +available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html + +[homepage]: https://www.contributor-covenant.org \ No newline at end of file diff --git a/.github/CONTRIBUTING.md b/.github/CONTRIBUTING.md new file mode 100644 index 0000000..53eb772 --- /dev/null +++ b/.github/CONTRIBUTING.md @@ -0,0 +1,85 @@ +# Contributing to Yōzefu + +These guidelines apply to all projects living in the `MAIF/yozefu` repository. + +These guidelines are meant to be a living document that should be changed and adapted as needed. +We encourage changes that make it easier to achieve our goals in an efficient way. + +## Codebase + +* [lib](https://github.com/MAIF/yozefu/tree/main/crates/lib): contains the core structures and the search module. +* [app](https://github.com/MAIF/yozefu/tree/main/crates/app): contains the Kafka consumer with utility functions. +* [tui](https://github.com/MAIF/yozefu/tree/main/crates/tui): the glue code between `app` and Ratatui. +* [wasm-types](https://github.com/MAIF/yozefu/tree/main/crates/wasm-types): type definitions for defining WebAssembly modules. +* [command](https://github.com/MAIF/yozefu/tree/main/crates/command): commands for the CLI. +* [bin](https://github.com/MAIF/yozefu/tree/main/crates/bin): the binary. + +## Workflow + +The steps below describe how to get a patch into a main development branch (e.g. `main`). +The steps are exactly the same for everyone involved in the project (be it core team or first-time contributor). +We follow the standard GitHub [fork & pull](https://help.github.com/articles/using-pull-requests/#fork--pull) approach to pull requests. Just fork the official repo, develop in a branch, and submit a PR! + +1. To avoid duplicated effort, it might be good to check the [issue tracker](https://github.com/MAIF/yozefu/issues) and [existing pull requests](https://github.com/MAIF/yozefu/pulls) for existing work. + - If there is no ticket yet, feel free to [create one](https://github.com/MAIF/yozefu/issues/new) to discuss the problem and the approach you want to take to solve it. +2. [Fork the project](https://github.com/MAIF/yozefu#fork-destination-box) on GitHub. You'll need to create a feature branch for your work on your fork, as this way you'll be able to submit a pull request against the mainline branch. +3. Create a branch on your fork and work on the feature. For example: `git checkout -b wip-awesome-new-feature` + - Please make sure to follow the general quality guidelines (specified below) when developing your patch. + - Please write additional tests covering your feature and adjust existing ones if needed before submitting your pull request. +4. Once your feature is complete, prepare the commit with a good commit message, for example: `Adding nice feature #42` (note the reference to the ticket it aimed to resolve). +5. Now it's finally time to [submit the pull request](https://help.github.com/articles/using-pull-requests)! + - Please make sure to include a reference to the issue you're solving *in the comment* for the Pull Request; this will cause the PR to be linked properly with the Issue. Examples of good phrases for this are: "Resolves #1234" or "Refs #1234". +6. Now both committers and interested people will review your code. This process is to ensure the code we merge is of the best possible quality, and that no silly mistakes slip through. You're expected to follow up on these comments by adding new commits to the same branch.
The commit messages of those commits can be looser, for example: `Removed debugging using printline`, as they all will be squashed into one commit before merging into the main branch. + - The community and team are really nice people, so don't be afraid to ask follow-up questions if you didn't understand some comment, or would like clarification on how to continue with a given feature. We're here to help, so feel free to ask and discuss any kind of questions you might have during review! +7. After the review you should fix the issues as needed (pushing a new commit for new review etc.), iterating until the reviewers give their thumbs up, which is usually signalled by a comment saying `LGTM`, which means "Looks Good To Me". +8. If the code change needs to be applied to other branches as well (for example a bugfix needing to be backported to a previous version), one of the team will either ask you to submit a PR with the same commit to the old branch, or do this for you. +9. Once everything is said and done, your pull request gets merged. You've made it! + +The TL;DR of the above workflow is: + +1. Fork yozefu +2. Hack and test on your feature (on a branch) +3. Document it +4. Submit a PR +5. Keep polishing it until you receive a thumbs up from the core team +6. Profit! + +## External dependencies + +All the external runtime dependencies for the project, including transitive dependencies, must have an open source license that is equal to, or compatible with, [Apache 2](http://www.apache.org/licenses/LICENSE-2.0). + +This must be ensured by manually verifying the license for all the dependencies for the project: + +1. Whenever a committer to the project changes a version of a dependency (including Rust) in the build file. +2. Whenever a committer to the project adds a new dependency. +3. Whenever a new release is cut (public or private for a customer). + +The licenses compatible with Apache 2 are defined in [this doc](http://www.apache.org/legal/3party.html#category-a), where you can see that the licenses listed under `Category A` are automatically compatible with Apache 2, while the ones listed under `Category B` need additional action: + +> Each license in this category requires some degree of [reciprocity](http://www.apache.org/legal/3party.html#define-reciprocal); therefore, additional action must be taken in order to minimize the chance that a user of an Apache product will create a derivative work of a reciprocally-licensed portion of an Apache product without being aware of the applicable requirements. + +Each project must also create and maintain a list of all dependencies and their licenses, including all their transitive dependencies. This can be done either in the documentation or in the build file next to each dependency. + + +## Getting started + +The easiest way to create a dev workspace is to run the following commands: + +```bash +bash docs/try-it.sh "Nantes" "json" "public-french-addresses-json" +bash docs/try-it.sh "Narbonne" "jsonSchema" "public-french-addresses-json-schema" +bash docs/try-it.sh "Niort" "avro" "public-french-addresses-avro" +bash docs/try-it.sh "Nancy" "text" "public-french-addresses-text" +bash docs/try-it.sh "Nimes" "malformed" "public-french-addresses-malformed" +cargo run -- -c localhost +``` + +`try-it.sh` is a script that boots a Kafka instance and a schema registry with Docker. It also publishes JSON data to a default topic.
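+ +If you want to check that this local environment is up before launching the TUI, a quick sanity check could look like the sketch below (it assumes the schema registry started by `try-it.sh` is exposed on `localhost:8082`, the port used by the `developer-experience` workflow): + +```bash +# List the containers started by try-it.sh (Kafka broker and schema registry) +docker ps +# The schema registry should answer on localhost:8082 (assumed port, matching YOZEFU_API_URL in the CI workflow) +curl -s http://localhost:8082/schemas/types +```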
+ +## Tests + +Every new feature should provide corresponding tests to ensure everything is working and will still work in future releases. To run the tests, run + +```sh +cargo test --all-features +``` \ No newline at end of file diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md new file mode 100644 index 0000000..26cc173 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/bug_report.md @@ -0,0 +1,29 @@ +--- +name: Bug report +about: Create a bug report +title: '' +labels: '' +assignees: '' + +--- +**Describe the bug** +A clear and concise description of what the bug is. + +**To Reproduce** +Steps to reproduce the behavior: +1. Run `--topics my-topics 'from begin'` +2. Select the Kafka record X... +3. See error + +**Expected behavior** +A clear and concise description of what you expected to happen. + +**Screenshots** +If applicable, add screenshots to help explain your problem. + +**version:** +[e.g. v0.1.0] + +**Additional context** +Additional information that may be relevant to the issue. +[e.g. architecture, OS] \ No newline at end of file diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md new file mode 100644 index 0000000..e69de29 diff --git a/.github/dependabot.yml b/.github/dependabot.yml new file mode 100644 index 0000000..dee3eeb --- /dev/null +++ b/.github/dependabot.yml @@ -0,0 +1,11 @@ +version: 2 +updates: + - package-ecosystem: "cargo" + directory: "/" + schedule: + interval: "weekly" + + - package-ecosystem: "github-actions" + directory: "/" + schedule: + interval: "monthly" \ No newline at end of file diff --git a/.github/workflows/actionlint.yml b/.github/workflows/actionlint.yml new file mode 100644 index 0000000..941bf36 --- /dev/null +++ b/.github/workflows/actionlint.yml @@ -0,0 +1,20 @@ +name: Actionlint + +on: + workflow_dispatch: + push: + paths: + - .github/workflows/** + +concurrency: + group: ${{ github.workflow }}-${{ github.ref }} + cancel-in-progress: true + +jobs: + actionlint: + runs-on: ubuntu-latest + steps: + - name: Checkout + uses: actions/checkout@v4 + - name: actionlint + uses: raven-actions/actionlint@v2.0.0 diff --git a/.github/workflows/build.yml b/.github/workflows/build.yml new file mode 100644 index 0000000..192398f --- /dev/null +++ b/.github/workflows/build.yml @@ -0,0 +1,196 @@ +name: Build + +on: + schedule: + - cron: "0 0 1 * *" + push: + paths-ignore: + - changelog + branches-ignore: + - main + workflow_dispatch: + +env: + CARGO_TERM_COLOR: always + RUST_BACKTRACE: 1 + RUSTFLAGS: "-D warnings" + RUSTDOCFLAGS: '--deny warnings' + MINIMUM_SUPPORTED_RUST_VERSION: 1.80.1 + +jobs: + check: + name: Check + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: actions-rust-lang/setup-rust-toolchain@v1 + with: + toolchain: stable + - uses: Swatinem/rust-cache@v2 + - name: Run cargo check + run: cargo check --all-features --locked --release --all + env: + CARGO_NET_GIT_FETCH_WITH_CLI: true + + build: + name: Build w/o features + needs: check + runs-on: ubuntu-latest + strategy: + matrix: + rust: + - stable + - beta + - nightly + steps: + - uses: actions/checkout@v4 + - uses: actions-rust-lang/setup-rust-toolchain@v1 + with: + toolchain: ${{ matrix.rust }} + - uses: Swatinem/rust-cache@v2 + - name: Run cargo build + run: cargo build + + build-for-targets: + name: Build for targets + needs: check + runs-on: ${{ matrix.platforms.os }} + strategy: + matrix: + platforms: + - os: macOS-latest + target: aarch64-apple-darwin + features: "--features ssl-vendored" + -
os: macos-latest-large + target: x86_64-apple-darwin + features: "--features ssl-vendored" + - os: ubuntu-latest + target: x86_64-unknown-linux-gnu + features: "--features ssl-vendored" + - os: ubuntu-latest + target: aarch64-unknown-linux-gnu + features: "--features ssl-vendored" + - os: windows-latest + target: x86_64-pc-windows-gnu + features: "" + - os: windows-latest + target: x86_64-pc-windows-msvc + features: "--features ssl-vendored" + + steps: + - uses: actions/checkout@v4 + - uses: houseabsolute/actions-rust-cross@v0.0.17 + # if: (github.event.pull_request.base.ref == 'main' && matrix.platforms.os == 'macos-latest-large') == false + with: + target: ${{ matrix.platforms.target }} + args: "--locked ${{ matrix.platforms.features }}" + strip: true + + clippy: + needs: check + name: Clippy + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: actions-rust-lang/setup-rust-toolchain@v1 + with: + toolchain: stable + components: clippy + - uses: Swatinem/rust-cache@v2 + - name: Run cargo clippy + run: cargo clippy --all-targets --all-features -- --deny warnings + + format: + needs: check + name: Format + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: actions-rust-lang/setup-rust-toolchain@v1 + with: + toolchain: stable + components: rustfmt + - uses: Swatinem/rust-cache@v2 + - name: Run cargo fmt + run: cargo fmt --all -- --check + + unused-dependencies: + needs: check + name: Unused dependencies + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: Swatinem/rust-cache@v2 + - uses: bnjbvr/cargo-machete@main + + tests: + needs: check + name: Tests + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: actions-rust-lang/setup-rust-toolchain@v1 + with: + toolchain: stable + - uses: Swatinem/rust-cache@v2 + - run: cargo test + env: + CI: "true" + + doc: + needs: check + name: Documentation + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: actions-rust-lang/setup-rust-toolchain@v1 + with: + toolchain: stable + - uses: Swatinem/rust-cache@v2 + - name: Build documentation + run: cargo doc --no-deps --document-private-items --verbose + + lychee: + name: Lychee + runs-on: ubuntu-latest + needs: [check] + steps: + - uses: actions/checkout@v4 + - uses: lycheeverse/lychee-action@v2 + name: Link Checker + # https://github.com/lycheeverse/lychee/issues/1405 + with: + args: --exclude-loopback README.md './crates/bin/README.md' './crates/app/README.md' './crates/command/README.md' './crates/lib/README.md' './crates/wasm-types/README.md' './crates/bin/src/**' './crates/app/src/**' './crates/command/src/**' './crates/lib/src/**' './crates/wasm-types/src/**' './docs/*.md' --exclude-path ./docs/url-templates/README.md --exclude-path ./docs/schema-registry/README.md --exclude 'https://docs.rs' + + # https://docs.github.com/en/actions/use-cases-and-examples/publishing-packages/publishing-docker-images + docker: + name: Docker image + runs-on: ubuntu-latest + needs: [check] + permissions: + contents: read + packages: write + attestations: write + id-token: write + steps: + - name: Checkout + uses: actions/checkout@v4 + - name: Set up Docker Buildx + uses: docker/setup-buildx-action@v3 + - name: Build and push + uses: docker/build-push-action@v6 + id: push + with: + push: false + cache-from: type=gha + cache-to: type=gha,mode=max + + typos: + name: Typos + runs-on: ubuntu-latest + needs: [check] + steps: + - name: Checkout Actions Repository + uses: actions/checkout@v4 + - name: Check spelling of the 
project + uses: crate-ci/typos@master \ No newline at end of file diff --git a/.github/workflows/code-coverage.yml b/.github/workflows/code-coverage.yml new file mode 100644 index 0000000..081c77d --- /dev/null +++ b/.github/workflows/code-coverage.yml @@ -0,0 +1,39 @@ +name: Code coverage + +on: + pull_request: + branches: + - main + paths-ignore: + - CHANGELOG.md + workflow_dispatch: + +env: + CARGO_TERM_COLOR: always + RUST_BACKTRACE: 1 + RUSTFLAGS: "-D warnings" + RUSTDOCFLAGS: '--deny warnings' + MINIMUM_SUPPORTED_RUST_VERSION: 1.80.1 + +concurrency: + group: ${{ github.workflow }}-${{ github.ref }} + cancel-in-progress: true + +jobs: + code-coverage: + name: Code coverage + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: actions-rust-lang/setup-rust-toolchain@v1 + with: + toolchain: stable + - uses: Swatinem/rust-cache@v2 + - name: Install cargo-tarpaulin + run: cargo install cargo-tarpaulin + - name: Run cargo tarpaulin + run: cargo tarpaulin --all-features --out Xml + - name: Upload coverage reports to Codecov + uses: codecov/codecov-action@v5 + env: + CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }} \ No newline at end of file diff --git a/.github/workflows/developer-experience.yml b/.github/workflows/developer-experience.yml new file mode 100644 index 0000000..0d556d0 --- /dev/null +++ b/.github/workflows/developer-experience.yml @@ -0,0 +1,76 @@ +name: Developer Experience + +env: + CARGO_TERM_COLOR: always + RUST_BACKTRACE: 1 + RUSTFLAGS: "-D warnings" + RUSTDOCFLAGS: '--deny warnings' + MINIMUM_SUPPORTED_RUST_VERSION: 1.80.1 + GOLANG_VERSION: 1.23.3 + +on: + schedule: + - cron: "0 0 1 * *" + pull_request: + branches: + - main + workflow_dispatch: + +concurrency: + group: ${{ github.workflow }}-${{ github.ref }} + cancel-in-progress: true + +permissions: + contents: read + +jobs: + try-it: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - name: Setup JBang + uses: jbangdev/setup-jbang@main + - name: Setup JDK + uses: actions/setup-java@v4 + with: + distribution: 'temurin' + java-version: '21' + - name: Run try-it.sh + run: bash docs/try-it.sh + env: + YOZEFU_API_URL: http://localhost:8082/schemas/types + + wasm-rust: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: actions-rust-lang/setup-rust-toolchain@v1 + with: + toolchain: stable + - name: Setup Go ${{ env.GOLANG_VERSION }} + uses: actions/setup-go@v5 + with: + go-version: ${{ env.GOLANG_VERSION }} + - name: Install extism CLI + run: go install github.com/extism/cli/extism@latest + - uses: Swatinem/rust-cache@v2 + - name: Temporary fix in Cargo.toml + run: sed -i -E 's#git = "ssh.+#path = "../../wasm-types" }#g' crates/wasm-blueprints/rust/Cargo.toml + - name: Build rust search filter + run: make -C crates/wasm-blueprints/rust/ build test + + wasm-golang: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - name: Setup Go ${{ env.GOLANG_VERSION }} + uses: actions/setup-go@v5 + with: + go-version: ${{ env.GOLANG_VERSION }} + - name: Install extism CLI + run: go install github.com/extism/cli/extism@latest + - uses: acifani/setup-tinygo@v2 + with: + tinygo-version: '0.34.0' + - name: Build golang search filter + run: make -C crates/wasm-blueprints/golang/ build test \ No newline at end of file diff --git a/.github/workflows/publish.yml b/.github/workflows/publish.yml new file mode 100644 index 0000000..8123d7e --- /dev/null +++ b/.github/workflows/publish.yml @@ -0,0 +1,284 @@ +name: Publish + +env: + CARGO_TERM_COLOR: always + RUST_BACKTRACE: 1 + 
RUSTFLAGS: "-D warnings" + RUSTDOCFLAGS: '--deny warnings' + MINIMUM_SUPPORTED_RUST_VERSION: 1.80.1 + DOCKER_REGISTRY: ghcr.io + +permissions: + id-token: write + packages: write + contents: write + attestations: write + pull-requests: write + +on: + workflow_dispatch: + push: + tags: + - '[0-9]+.[0-9]+.[0-9]+' + workflow_call: +# workflow_run: +# workflows: ['Tag'] +# types: +# - completed + +jobs: + version: + name: Determine version to publish + runs-on: ubuntu-latest + outputs: + version: ${{ steps.release.outputs.version }} + steps: + - uses: actions/checkout@v4 + - name: Install Rust + uses: dtolnay/rust-toolchain@stable + - uses: Swatinem/rust-cache@v2 + - id: release + run: echo "version=$(cargo pkgid --manifest-path crates/bin/Cargo.toml | cut -d '@' -f2)" >> "$GITHUB_OUTPUT" + + create-release: + runs-on: ubuntu-latest + name: Create release + needs: version + steps: + - uses: actions/checkout@v4 + - name: Create github release + run: gh release create "v${{ needs.version.outputs.version }}" --generate-notes + env: + GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} + + changelog: + needs: [version, create-release] + runs-on: ubuntu-latest + steps: + - name: Check out repository + uses: actions/checkout@v4 + with: + fetch-depth: 0 + - run: git fetch --tags origin + - name: Install Rust + uses: dtolnay/rust-toolchain@stable + - uses: Swatinem/rust-cache@v2 + - name: Install git-cliff + run: cargo install git-cliff + - name: Generate a changelog + run: | + if [ -f CHANGELOG.md ]; then + git-cliff --config github --prepend CHANGELOG.md --latest + else + git-cliff --config github --output CHANGELOG.md + fi + env: + OUTPUT: CHANGELOG.md + GITHUB_REPO: ${{ github.repository }} + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + - name: Commit changelog + run: | + git checkout -b "changelog/${{ needs.version.outputs.version }}" + git config user.name 'github-actions[bot]' + git config user.email 'github-actions[bot]@users.noreply.github.com' + set +e + git add CHANGELOG.md + git commit -m "chore: Update changelog" + git push origin "changelog/${{ needs.version.outputs.version }}" --force + - name: Create pull request for changelog + run: | + alreadyExists=$(gh pr list --json headRefName | jq '.[] | select(.headRefName == "dependabot/cargo/${{ needs.version.outputs.version }}") | any') + if [[ "$alreadyExists" == "" ]]; then + gh pr create --title "Changelog for ${{ needs.version.outputs.version }}" --body "This PR updates the changelog for version ${{ needs.version.outputs.version }}." 
+ fi + env: + GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} + + build-and-publish: + needs: [version, create-release] + name: Build and publish + runs-on: ${{ matrix.platforms.os }} + strategy: + matrix: + platforms: + - os: macOS-latest + target: aarch64-apple-darwin + features: "--features ssl-vendored" + - os: macos-latest-large + target: x86_64-apple-darwin + features: "--features ssl-vendored" + - os: ubuntu-latest + target: x86_64-unknown-linux-gnu + features: "--features ssl-vendored" + - os: ubuntu-latest + target: aarch64-unknown-linux-gnu + features: "--features ssl-vendored" + - os: windows-latest + target: x86_64-pc-windows-gnu + features: "" + - os: windows-latest + target: x86_64-pc-windows-msvc + features: "--features ssl-vendored" + steps: + - uses: actions/checkout@v4 + - uses: houseabsolute/actions-rust-cross@v0.0.17 + with: + target: ${{ matrix.platforms.target }} + args: "--verbose --locked --release ${{ matrix.platforms.features }}" + strip: true + + - name: Setup variables + id: variables + shell: bash + run: | + name="$(cargo metadata --manifest-path crates/bin/Cargo.toml --no-deps --format-version=1 | jq -r '.packages[] | select(.name=="yozefu").targets[0].name')" + source=target/${{ matrix.platforms.target }}/release/${name} + archive="yozefu-${{matrix.platforms.target }}" + subjectName="${name}-${{matrix.platforms.target }}-${{ needs.version.outputs.version }}" + binaryName="${subjectName}" + destination="dist/${binaryName}" + if [ "${{ matrix.platforms.os }}" = "windows-latest" ]; then + source=${source}.exe + binaryName=${binaryName}.exe + destination="dist/${binaryName}" + fi + { + echo "name=${name}" + echo "source=${source}" + echo "archive=${archive}" + echo "subjectName=${subjectName}" + echo "destination=${destination}" + echo "binaryName=${binaryName}" + } >> "$GITHUB_OUTPUT" + + - name: Attest + uses: actions/attest-build-provenance@v1 + with: + subject-path: "${{ steps.variables.outputs.source }}" + subject-name: "${{ steps.variables.outputs.archive }}-${{ needs.version.outputs.version }}" + + - name: Create temp dir + run: mkdir -p dist + + - name: Rename binary + shell: bash + run: mv "${{ steps.variables.outputs.source }}" "${{ steps.variables.outputs.destination }}" + + - name: Build archive for Windows + shell: bash + working-directory: ./dist + if: matrix.platforms.os == 'windows-latest' + run: 7z a "${{ steps.variables.outputs.archive }}.zip" "${{ steps.variables.outputs.binaryName }}" + + - name: Build archive for unix systems + if: matrix.platforms.os != 'windows-latest' + shell: bash + working-directory: ./dist + run: | + tar cvzf "${{ steps.variables.outputs.archive }}.tar.gz" "${{ steps.variables.outputs.binaryName }}" + - name: Clean release directory + shell: bash + run: rm -f "dist/${{ steps.variables.outputs.binaryName }}" + + - name: cargo install cargo-cyclonedx + run: cargo install cargo-cyclonedx + + - name: Generate SBOM + run: cargo cyclonedx --describe binaries --format json ${{ matrix.platforms.features }} --target ${{ matrix.platforms.target }} + + - name: Rename SBOM + shell: bash + run: mv crates/bin/${{ steps.variables.outputs.name }}_bin.cdx.json "dist/${{ steps.variables.outputs.archive }}.cdx.json" + + - name: Upload binary + shell: bash + run: gh release upload "v${{ needs.version.outputs.version }}" dist/* --clobber + env: + GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} + + +# https://docs.github.com/en/actions/use-cases-and-examples/publishing-packages/publishing-docker-images + publish-docker-image: + name: Docker image + runs-on: 
ubuntu-latest + needs: [version, create-release] + permissions: + contents: read + packages: write + attestations: write + id-token: write + steps: + - name: Checkout + uses: actions/checkout@v4 + - name: Set up Docker Buildx + uses: docker/setup-buildx-action@v3 + - name: Extract metadata + id: meta + uses: docker/metadata-action@v5 + with: + tags: | + type=raw,value=latest,enable={{is_default_branch}} + type=raw,value=${{ needs.version.outputs.version }} + images: ${{ env.DOCKER_REGISTRY }}/${{ github.repository }} + labels: | + org.opencontainers.image.description=Yozefu is a CLI tool for Apache kafka. It allows you to navigate topics and search Kafka records. + org.opencontainers.image.vendor=Yann Prono + org.opencontainers.image.licenses=Apache-2.0 + env: + DOCKER_METADATA_ANNOTATIONS_LEVELS: manifest,index + - name: Login to GitHub Container Registry + uses: docker/login-action@v3 + with: + registry: ghcr.io + username: ${{ github.actor }} + password: ${{ secrets.GITHUB_TOKEN }} + - name: Build and push + uses: docker/build-push-action@v6 + id: push + with: + push: true + tags: ${{ steps.meta.outputs.tags }} + cache-from: type=gha + cache-to: type=gha,mode=max + labels: ${{ steps.meta.outputs.labels }} + sbom: true + annotations: ${{ steps.meta.outputs.annotations }} + # https://github.com/actions/attest-build-provenance + - name: Generate artifact attestation + if: false + uses: actions/attest-build-provenance@v1 + with: + subject-name: ${{ env.DOCKER_REGISTRY }}/${{ github.repository }} + subject-digest: ${{ steps.push.outputs.digest }} + push-to-registry: true + + publish-to-registry: + runs-on: ubuntu-latest + name: Publish to registry + needs: [version, build-and-publish] + if: github.ref == 'refs/heads/main' + permissions: + contents: write + steps: + - name: Checkout + uses: actions/checkout@v4 + - name: Install Rust + uses: dtolnay/rust-toolchain@stable + - uses: Swatinem/rust-cache@v2 + - name: Make sure crates are ready + continue-on-error: true + run: | + cargo publish --dry-run -p yozefu-lib + cargo publish --dry-run -p yozefu-wasm-types + cargo publish --dry-run -p yozefu-app + cargo publish --dry-run -p yozefu-tui + cargo publish --dry-run -p yozefu-command + cargo publish --dry-run -p yozefu-bin + - name: Publish the crates + run: | + cargo publish --token ${{ secrets.CARGO_REGISTRY_TOKEN }} -p yozefu-lib + cargo publish --token ${{ secrets.CARGO_REGISTRY_TOKEN }} -p yozefu-wasm-types + cargo publish --token ${{ secrets.CARGO_REGISTRY_TOKEN }} -p yozefu-app + cargo publish --token ${{ secrets.CARGO_REGISTRY_TOKEN }} -p yozefu-tui + cargo publish --token ${{ secrets.CARGO_REGISTRY_TOKEN }} -p yozefu-command + cargo publish --token ${{ secrets.CARGO_REGISTRY_TOKEN }} -p yozefu-bin \ No newline at end of file diff --git a/.github/workflows/security.yml b/.github/workflows/security.yml new file mode 100644 index 0000000..0ddf26c --- /dev/null +++ b/.github/workflows/security.yml @@ -0,0 +1,50 @@ +name: Security + +env: + CARGO_TERM_COLOR: always + RUST_BACKTRACE: 1 + RUSTFLAGS: "-D warnings" + RUSTDOCFLAGS: '--deny warnings' + MINIMUM_SUPPORTED_RUST_VERSION: 1.80.1 + +on: + schedule: + - cron: "0 0 1 * *" + pull_request: + branches: + - main + workflow_dispatch: + +concurrency: + group: ${{ github.workflow }}-${{ github.ref }} + cancel-in-progress: true + +permissions: + contents: read + +jobs: + dependencies: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: Swatinem/rust-cache@v2 + - uses: actions-rust-lang/setup-rust-toolchain@v1 + with: + 
toolchain: stable + - name: Install cargo-edit + run: cargo install cargo-edit + - name: Check for outdated dependencies + run: cargo upgrade --dry-run --locked + + audit: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: Swatinem/rust-cache@v2 + - uses: actions-rust-lang/setup-rust-toolchain@v1 + with: + toolchain: stable + - name: Install cargo-audit + run: cargo install cargo-audit + - name: Cargo audit + run: cargo audit \ No newline at end of file diff --git a/.github/workflows/semver-checks.yml b/.github/workflows/semver-checks.yml new file mode 100644 index 0000000..dc87675 --- /dev/null +++ b/.github/workflows/semver-checks.yml @@ -0,0 +1,99 @@ +name: Semver checks + +env: + CARGO_TERM_COLOR: always + RUST_BACKTRACE: 1 + RUSTFLAGS: "-D warnings" + RUSTDOCFLAGS: '--deny warnings' + MINIMUM_SUPPORTED_RUST_VERSION: 1.80.1 + +permissions: + packages: write + contents: write + +on: + pull_request: + branches: + - main + paths-ignore: + - CHANGELOG.md + workflow_dispatch: + workflow_run: + workflows: ['Prepare a release'] + types: + - completed + + +concurrency: + group: ${{ github.workflow }}-${{ github.ref }} + cancel-in-progress: true + +jobs: + check-branches: + runs-on: ubuntu-latest + steps: + - name: you can't run this action on 'main' branch + run: | + if [[ "${{ github.ref_name }}" = "main" ]]; then + exit 1 + fi + cargo-semver-checks: + needs: check-branches + permissions: + pull-requests: write + contents: write + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + with: + fetch-depth: 0 + - uses: dtolnay/rust-toolchain@master + with: + toolchain: stable + - uses: Swatinem/rust-cache@v2 + - name: Install cargo-semver-checks + run: cargo install cargo-semver-checks + - name: List the releases on GitHub + id: current + run: echo "version=$(git tag --sort=-creatordate | head -n 1)" >> "$GITHUB_OUTPUT" + - name: Get next version + id: next + run: echo "version=v$(cargo pkgid --manifest-path crates/bin/Cargo.toml | cut -d '@' -f2)" >> "$GITHUB_OUTPUT" + - name: Make sure version is updated + if: ${{ steps.current.outputs.version == steps.next.outputs.version }} + run: | + echo "::warning title=Next version:: Last public version is '${{ steps.current.outputs.version }}' but version of this branch is '${{ steps.next.outputs.version }}'. Did you forget to update the version? More details at https://github.com/MAIF/yozefu/blob/main/docs/release.md" + printf 'This pull request is not ready because the crate version is equal to the latest git tag version `' > report.md + printf "${{ steps.next.outputs.version }}" >> report.md + printf '`. I think you forgot to bump the version. More details at https://github.com/MAIF/yozefu/blob/main/docs/release.md' >> report.md + gh pr comment ${{ github.event.number }} --body-file ./report.md + exit 1 + env: + GH_TOKEN: ${{ github.token }} + + - name: Show release type + id: semver + run: | + wget -O semver "https://raw.githubusercontent.com/fsaintjacques/semver-tool/master/src/semver" + chmod u+x ./semver + echo "release=$(./semver diff ${{ steps.current.outputs.version }} ${{ steps.next.outputs.version }})" >> "$GITHUB_OUTPUT" + - name: Prepare report.md + if: ${{ steps.next.outputs.version == '' }} + run: | + { + echo "> [!WARNING]" + echo "> According to \`cargo-semver-checks\`, the next release version doesn't respect semantic versioning."
+ echo '```bash' + } > ./report.md + - name: Run cargo semver-checks + if: ${{ steps.next.outputs.version != '' && steps.semver.outputs.release != '' }} + id: check + run: cargo semver-checks --color never --package yozefu-lib --package yozefu-app --package yozefu-types --baseline-rev "${{ steps.current.outputs.version }}" --release-type "${{ steps.semver.outputs.release }}" >> "$GITHUB_OUTPUT" + - name: Publish semver-checks report + if: ${{ steps.check.outcome != 'success' && steps.check.outcome != 'skipped' }} + run: | + echo "${{ steps.check.outputs.version }}" >> report.md + printf '\n```' >> ./report.md + gh pr comment ${{ github.event.number }} --body-file ./report.md + env: + GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} \ No newline at end of file diff --git a/.github/workflows/tag.yml b/.github/workflows/tag.yml new file mode 100644 index 0000000..7af8feb --- /dev/null +++ b/.github/workflows/tag.yml @@ -0,0 +1,62 @@ +name: Tag + +env: + CARGO_TERM_COLOR: always + RUST_BACKTRACE: 1 + RUSTFLAGS: "-D warnings" + RUSTDOCFLAGS: '--deny warnings' + MINIMUM_SUPPORTED_RUST_VERSION: 1.80.1 + +permissions: + contents: write + actions: write + packages: write + attestations: write + id-token: write + pull-requests: write +on: + push: + branches: + - main + paths-ignore: + - CHANGELOG.md + +jobs: + tag: + runs-on: ubuntu-latest + if: github.ref == 'refs/heads/main' + outputs: + created: ${{ steps.tag.outputs.created }} + permissions: + contents: write + steps: + - name: Checkout + uses: actions/checkout@v4 + - name: Get tags + run: git fetch --tags origin + - name: Install Rust + uses: dtolnay/rust-toolchain@stable + - uses: Swatinem/rust-cache@v2 + - name: Configure git + run: | + git config --global user.name "${{ github.actor }}" + git config --global user.email "${{ github.actor_id }}+${{ github.actor }}@users.noreply.github.com" + git config --global push.autoSetupRemote true + - name: Get the version + id: current + run: echo "version=$(cargo pkgid --manifest-path crates/bin/Cargo.toml | cut -d '@' -f2)" >> "$GITHUB_OUTPUT" + - name: Create git tag + id: tag + run: | + if git tag -a "v${{ steps.current.outputs.version }}" -m "${{ steps.current.outputs.version }}"; then + echo "created=true" >> "$GITHUB_OUTPUT" + else + echo "::notice title=Git tag:: The version ${{ steps.current.outputs.version }} already exists. The \`publish\` workflow won't be executed." + echo "created=false" >> "$GITHUB_OUTPUT" + fi + + publish: + needs: tag + if: ${{ needs.tag.outputs.created == 'true' }} + uses: ./.github/workflows/publish.yml + secrets: inherit diff --git a/.gitignore b/.gitignore new file mode 100644 index 0000000..cdfb12f --- /dev/null +++ b/.gitignore @@ -0,0 +1,24 @@ +/target +.vscode +**/*DS_Store +akhq +tarpaulin-report.html +lol +lol/rust/Cargo.lock +semver-checks/ +demo.sh +ee.json + +crates/**/*.cdx.xml +crates/**/*.cdx.json +crates/wasm-blueprints/golang/go.sum +kpow.env +.idea/ +lol-export +crates/wasm-blueprints/**/module.wasm +crates/wasm-blueprints/rust/Cargo.lock +yozefu/ +dump +docs/avro-tools.jar +yozefu-exports/ +.env \ No newline at end of file diff --git a/Cargo.lock b/Cargo.lock new file mode 100644 index 0000000..9ffa73b --- /dev/null +++ b/Cargo.lock @@ -0,0 +1,4921 @@ +# This file is automatically @generated by Cargo. +# It is not intended for manual editing. 
+version = 4 + +[[package]] +name = "addr2line" +version = "0.24.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "dfbe277e56a376000877090da837660b4427aad530e3028d44e0bffe4f89a1c1" +dependencies = [ + "gimli", +] + +[[package]] +name = "adler2" +version = "2.0.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "512761e0bb2578dd7380c6baaa0f4ce03e84f95e960231d1dec8bf4d7d6e2627" + +[[package]] +name = "adler32" +version = "1.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "aae1277d39aeec15cb388266ecc24b11c80469deae6067e17a1a7aa9e5c1f234" + +[[package]] +name = "ahash" +version = "0.8.11" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e89da841a80418a9b391ebaea17f5c112ffaaa96f621d2c285b5174da76b9011" +dependencies = [ + "cfg-if", + "once_cell", + "version_check", + "zerocopy", +] + +[[package]] +name = "aho-corasick" +version = "1.1.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8e60d3430d3a69478ad0993f19238d2df97c507009a52b3c10addcd7f6bcb916" +dependencies = [ + "memchr", +] + +[[package]] +name = "allocator-api2" +version = "0.2.21" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "683d7910e743518b0e34f1186f92494becacb047c7b6bf616c96772180fef923" + +[[package]] +name = "ambient-authority" +version = "0.0.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e9d4ee0d472d1cd2e28c97dfa124b3d8d992e10eb0a035f33f5d12e3a177ba3b" + +[[package]] +name = "android-tzdata" +version = "0.1.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e999941b234f3131b00bc13c22d06e8c5ff726d1b6318ac7eb276997bbb4fef0" + +[[package]] +name = "android_system_properties" +version = "0.1.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "819e7219dbd41043ac279b19830f2efc897156490d7fd6ea916720117ee66311" +dependencies = [ + "libc", +] + +[[package]] +name = "anstream" +version = "0.6.18" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8acc5369981196006228e28809f761875c0327210a891e941f4c683b3a99529b" +dependencies = [ + "anstyle", + "anstyle-parse", + "anstyle-query", + "anstyle-wincon", + "colorchoice", + "is_terminal_polyfill", + "utf8parse", +] + +[[package]] +name = "anstyle" +version = "1.0.10" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "55cc3b69f167a1ef2e161439aa98aed94e6028e5f9a59be9a6ffb47aef1651f9" + +[[package]] +name = "anstyle-parse" +version = "0.2.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3b2d16507662817a6a20a9ea92df6652ee4f94f914589377d69f3b21bc5798a9" +dependencies = [ + "utf8parse", +] + +[[package]] +name = "anstyle-query" +version = "1.1.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "79947af37f4177cfead1110013d678905c37501914fba0efea834c3fe9a8d60c" +dependencies = [ + "windows-sys 0.59.0", +] + +[[package]] +name = "anstyle-wincon" +version = "3.0.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2109dbce0e72be3ec00bed26e6a7479ca384ad226efdd66db8fa2e3a38c83125" +dependencies = [ + "anstyle", + "windows-sys 0.59.0", +] + +[[package]] +name = "anyhow" +version = "1.0.94" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c1fd03a028ef38ba2276dce7e33fcd6369c158a1bca17946c4b1b701891c1ff7" + +[[package]] +name = "apache-avro" +version = 
"0.17.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1aef82843a0ec9f8b19567445ad2421ceeb1d711514384bdd3d49fe37102ee13" +dependencies = [ + "bigdecimal", + "digest", + "libflate", + "log", + "num-bigint", + "quad-rand", + "rand", + "regex-lite", + "serde", + "serde_bytes", + "serde_json", + "strum", + "strum_macros", + "thiserror", + "typed-builder", + "uuid", +] + +[[package]] +name = "arbitrary" +version = "1.4.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "dde20b3d026af13f561bdd0f15edf01fc734f0dafcedbaf42bba506a9517f223" + +[[package]] +name = "async-trait" +version = "0.1.83" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "721cae7de5c34fbb2acd27e21e6d2cf7b886dce0c27388d46c4e6c47ea4318dd" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "atomic-waker" +version = "1.1.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1505bd5d3d116872e7271a6d4e16d81d0c8570876c8de68093a09ac269d8aac0" + +[[package]] +name = "autocfg" +version = "1.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ace50bade8e6234aa140d9a2f552bbee1db4d353f69b8217bc503490fc1a9f26" + +[[package]] +name = "backtrace" +version = "0.3.74" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8d82cb332cdfaed17ae235a638438ac4d4839913cc2af585c3c6746e8f8bee1a" +dependencies = [ + "addr2line", + "cfg-if", + "libc", + "miniz_oxide", + "object", + "rustc-demangle", + "windows-targets 0.52.6", +] + +[[package]] +name = "base64" +version = "0.21.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9d297deb1925b89f2ccc13d7635fa0714f12c87adce1c75356b39ca9b7178567" + +[[package]] +name = "base64" +version = "0.22.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "72b3254f16251a8381aa12e40e3c4d2f0199f8c6508fbecb9d91f575e0fbb8c6" + +[[package]] +name = "bigdecimal" +version = "0.4.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7f31f3af01c5c65a07985c804d3366560e6fa7883d640a122819b14ec327482c" +dependencies = [ + "autocfg", + "libm", + "num-bigint", + "num-integer", + "num-traits", + "serde", +] + +[[package]] +name = "bitflags" +version = "2.6.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b048fb63fd8b5923fc5aa7b340d8e156aec7ec02f0c78fa8a6ddc2613f6f71de" +dependencies = [ + "serde", +] + +[[package]] +name = "block" +version = "0.1.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0d8c1fef690941d3e7788d328517591fecc684c084084702d6ff1641e993699a" + +[[package]] +name = "block-buffer" +version = "0.10.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3078c7629b62d3f0439517fa394996acacc5cbc91c5a20d8c658e77abd503a71" +dependencies = [ + "generic-array", +] + +[[package]] +name = "bstr" +version = "1.11.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1a68f1f47cdf0ec8ee4b941b2eee2a80cb796db73118c0dd09ac63fbe405be22" +dependencies = [ + "memchr", + "serde", +] + +[[package]] +name = "bumpalo" +version = "3.16.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "79296716171880943b8470b5f8d03aa55eb2e645a4874bdbb28adb49162e012c" + +[[package]] +name = "bytemuck" +version = "1.20.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"8b37c88a63ffd85d15b406896cc343916d7cf57838a847b3a6f2ca5d39a5695a" + +[[package]] +name = "byteorder" +version = "1.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1fd0f2584146f6f2ef48085050886acf353beff7305ebd1ae69500e27c67f64b" + +[[package]] +name = "bytes" +version = "1.9.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "325918d6fe32f23b19878fe4b34794ae41fc19ddbe53b10571a4874d44ffd39b" + +[[package]] +name = "bytesize" +version = "1.3.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a3e368af43e418a04d52505cf3dbc23dda4e3407ae2fa99fd0e4f308ce546acc" + +[[package]] +name = "calloop" +version = "0.13.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b99da2f8558ca23c71f4fd15dc57c906239752dd27ff3c00a1d56b685b7cbfec" +dependencies = [ + "bitflags", + "log", + "polling", + "rustix", + "slab", + "thiserror", +] + +[[package]] +name = "calloop-wayland-source" +version = "0.3.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "95a66a987056935f7efce4ab5668920b5d0dac4a7c99991a67395f13702ddd20" +dependencies = [ + "calloop", + "rustix", + "wayland-backend", + "wayland-client", +] + +[[package]] +name = "cap-fs-ext" +version = "3.4.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7f78efdd7378980d79c0f36b519e51191742d2c9f91ffa5e228fba9f3806d2e1" +dependencies = [ + "cap-primitives", + "cap-std", + "io-lifetimes", + "windows-sys 0.59.0", +] + +[[package]] +name = "cap-primitives" +version = "3.4.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8fc15faeed2223d8b8e8cc1857f5861935a06d06713c4ac106b722ae9ce3c369" +dependencies = [ + "ambient-authority", + "fs-set-times", + "io-extras", + "io-lifetimes", + "ipnet", + "maybe-owned", + "rustix", + "windows-sys 0.59.0", + "winx", +] + +[[package]] +name = "cap-rand" +version = "3.4.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "dea13372b49df066d1ae654e5c6e41799c1efd9f6b36794b921e877ea4037977" +dependencies = [ + "ambient-authority", + "rand", +] + +[[package]] +name = "cap-std" +version = "3.4.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c3dbd3e8e8d093d6ccb4b512264869e1281cdb032f7940bd50b2894f96f25609" +dependencies = [ + "cap-primitives", + "io-extras", + "io-lifetimes", + "rustix", +] + +[[package]] +name = "cap-time-ext" +version = "3.4.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "bd736b20fc033f564a1995fb82fc349146de43aabba19c7368b4cb17d8f9ea53" +dependencies = [ + "ambient-authority", + "cap-primitives", + "iana-time-zone", + "once_cell", + "rustix", + "winx", +] + +[[package]] +name = "cassowary" +version = "0.3.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "df8670b8c7b9dae1793364eafadf7239c40d669904660c5960d74cfd80b46a53" + +[[package]] +name = "castaway" +version = "0.2.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0abae9be0aaf9ea96a3b1b8b1b55c602ca751eba1b1500220cea4ecbafe7c0d5" +dependencies = [ + "rustversion", +] + +[[package]] +name = "cbindgen" +version = "0.27.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3fce8dd7fcfcbf3a0a87d8f515194b49d6135acab73e18bd380d1d93bb1a15eb" +dependencies = [ + "heck 0.4.1", + "indexmap", + "log", + "proc-macro2", + "quote", + "serde", + "serde_json", + "syn", + "tempfile", + 
"toml", +] + +[[package]] +name = "cc" +version = "1.2.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "27f657647bcff5394bf56c7317665bbf790a137a50eaaa5c6bfbb9e27a518f2d" +dependencies = [ + "jobserver", + "libc", + "shlex", +] + +[[package]] +name = "cfg-if" +version = "1.0.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd" + +[[package]] +name = "chrono" +version = "0.4.39" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7e36cc9d416881d2e24f9a963be5fb1cd90966419ac844274161d10488b3e825" +dependencies = [ + "android-tzdata", + "iana-time-zone", + "js-sys", + "num-traits", + "serde", + "wasm-bindgen", + "windows-targets 0.52.6", +] + +[[package]] +name = "circular-buffer" +version = "0.1.9" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b67261db007b5f4cf8cba393c1a5c511a5cc072339ce16e12aeba1d7b9b77946" + +[[package]] +name = "clap" +version = "4.5.23" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3135e7ec2ef7b10c6ed8950f0f792ed96ee093fa088608f1c76e569722700c84" +dependencies = [ + "clap_builder", + "clap_derive", +] + +[[package]] +name = "clap_builder" +version = "4.5.23" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "30582fc632330df2bd26877bde0c1f4470d57c582bbc070376afcd04d8cb4838" +dependencies = [ + "anstream", + "anstyle", + "clap_lex", + "strsim", +] + +[[package]] +name = "clap_derive" +version = "4.5.18" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4ac6a0c7b1a9e9a5186361f67dfa1b88213572f427fb9ab038efb2bd8c582dab" +dependencies = [ + "heck 0.5.0", + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "clap_lex" +version = "0.7.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f46ad14479a25103f283c0f10005961cf086d8dc42205bb44c46ac563475dca6" + +[[package]] +name = "clipboard-win" +version = "3.1.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9fdf5e01086b6be750428ba4a40619f847eb2e95756eee84b18e06e5f0b50342" +dependencies = [ + "lazy-bytes-cast", + "winapi", +] + +[[package]] +name = "cmake" +version = "0.1.52" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c682c223677e0e5b6b7f63a64b9351844c3f1b1678a68b7ee617e30fb082620e" +dependencies = [ + "cc", +] + +[[package]] +name = "cobs" +version = "0.2.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "67ba02a97a2bd10f4b59b25c7973101c79642302776489e030cd13cdab09ed15" + +[[package]] +name = "colorchoice" +version = "1.0.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5b63caa9aa9397e2d9480a9b13673856c78d8ac123288526c37d7839f2a86990" + +[[package]] +name = "compact_str" +version = "0.8.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6050c3a16ddab2e412160b31f2c871015704239bca62f72f6e5f0be631d3f644" +dependencies = [ + "castaway", + "cfg-if", + "itoa", + "rustversion", + "ryu", + "serde", + "static_assertions", +] + +[[package]] +name = "concurrent-queue" +version = "2.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4ca0197aee26d1ae37445ee532fefce43251d24cc7c166799f4d46817f1d3973" +dependencies = [ + "crossbeam-utils", +] + +[[package]] +name = "console" +version = "0.15.8" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "0e1f83fc076bd6dd27517eacdf25fef6c4dfe5f1d7448bafaaf3a26f13b5e4eb" +dependencies = [ + "encode_unicode", + "lazy_static", + "libc", + "unicode-width 0.1.14", + "windows-sys 0.52.0", +] + +[[package]] +name = "copypasta" +version = "0.10.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "deb85422867ca93da58b7f95fb5c0c10f6183ed6e1ef8841568968a896d3a858" +dependencies = [ + "clipboard-win", + "objc", + "objc-foundation", + "objc_id", + "smithay-clipboard", + "x11-clipboard", +] + +[[package]] +name = "core-foundation" +version = "0.9.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "91e195e091a93c46f7102ec7818a2aa394e1e1771c3ab4825963fa03e45afb8f" +dependencies = [ + "core-foundation-sys", + "libc", +] + +[[package]] +name = "core-foundation-sys" +version = "0.8.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "773648b94d0e5d620f64f280777445740e61fe701025087ec8b57f45c791888b" + +[[package]] +name = "core2" +version = "0.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b49ba7ef1ad6107f8824dbe97de947cbaac53c44e7f9756a1fba0d37c1eec505" +dependencies = [ + "memchr", +] + +[[package]] +name = "cpp_demangle" +version = "0.4.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "96e58d342ad113c2b878f16d5d034c03be492ae460cdbc02b7f0f2284d310c7d" +dependencies = [ + "cfg-if", +] + +[[package]] +name = "cpufeatures" +version = "0.2.16" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "16b80225097f2e5ae4e7179dd2266824648f3e2f49d9134d584b76389d31c4c3" +dependencies = [ + "libc", +] + +[[package]] +name = "cranelift-bforest" +version = "0.113.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "540b193ff98b825a1f250a75b3118911af918a734154c69d80bcfcf91e7e9522" +dependencies = [ + "cranelift-entity", +] + +[[package]] +name = "cranelift-bitset" +version = "0.113.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c7cb269598b9557ab942d687d3c1086d77c4b50dcf35813f3a65ba306fd42279" +dependencies = [ + "serde", + "serde_derive", +] + +[[package]] +name = "cranelift-codegen" +version = "0.113.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "46566d7c83a8bff4150748d66020f4c7224091952aa4b4df1ec4959c39d937a1" +dependencies = [ + "bumpalo", + "cranelift-bforest", + "cranelift-bitset", + "cranelift-codegen-meta", + "cranelift-codegen-shared", + "cranelift-control", + "cranelift-entity", + "cranelift-isle", + "gimli", + "hashbrown 0.14.5", + "log", + "regalloc2", + "rustc-hash", + "smallvec", + "target-lexicon", +] + +[[package]] +name = "cranelift-codegen-meta" +version = "0.113.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2df8a86a34236cc75a8a6a271973da779c2aeb36c43b6e14da474cf931317082" +dependencies = [ + "cranelift-codegen-shared", +] + +[[package]] +name = "cranelift-codegen-shared" +version = "0.113.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "cf75340b6a57b7c7c1b74f10d3d90883ee6d43a554be8131a4046c2ebcf5eb65" + +[[package]] +name = "cranelift-control" +version = "0.113.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2e84495bc5d23d86aad8c86f8ade4af765b94882af60d60e271d3153942f1978" +dependencies = [ + "arbitrary", +] + +[[package]] +name = "cranelift-entity" +version = 
"0.113.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "963c17147b80df351965e57c04d20dbedc85bcaf44c3436780a59a3f1ff1b1c2" +dependencies = [ + "cranelift-bitset", + "serde", + "serde_derive", +] + +[[package]] +name = "cranelift-frontend" +version = "0.113.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "727f02acbc4b4cb2ba38a6637101d579db50190df1dd05168c68e762851a3dd5" +dependencies = [ + "cranelift-codegen", + "log", + "smallvec", + "target-lexicon", +] + +[[package]] +name = "cranelift-isle" +version = "0.113.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "32b00cc2e03c748f2531eea01c871f502b909d30295fdcad43aec7bf5c5b4667" + +[[package]] +name = "cranelift-native" +version = "0.113.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "bbeaf978dc7c1a2de8bbb9162510ed218eb156697bc45590b8fbdd69bb08e8de" +dependencies = [ + "cranelift-codegen", + "libc", + "target-lexicon", +] + +[[package]] +name = "crc32fast" +version = "1.4.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a97769d94ddab943e4510d138150169a2758b5ef3eb191a9ee688de3e23ef7b3" +dependencies = [ + "cfg-if", +] + +[[package]] +name = "crossbeam-deque" +version = "0.8.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "613f8cc01fe9cf1a3eb3d7f488fd2fa8388403e97039e2f73692932e291a770d" +dependencies = [ + "crossbeam-epoch", + "crossbeam-utils", +] + +[[package]] +name = "crossbeam-epoch" +version = "0.9.18" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5b82ac4a3c2ca9c3460964f020e1402edd5753411d7737aa39c3714ad1b5420e" +dependencies = [ + "crossbeam-utils", +] + +[[package]] +name = "crossbeam-utils" +version = "0.8.20" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "22ec99545bb0ed0ea7bb9b8e1e9122ea386ff8a48c0922e43f36d45ab09e0e80" + +[[package]] +name = "crossterm" +version = "0.28.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "829d955a0bb380ef178a640b91779e3987da38c9aea133b20614cfed8cdea9c6" +dependencies = [ + "bitflags", + "crossterm_winapi", + "futures-core", + "mio", + "parking_lot", + "rustix", + "signal-hook", + "signal-hook-mio", + "winapi", +] + +[[package]] +name = "crossterm_winapi" +version = "0.9.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "acdd7c62a3665c7f6830a51635d9ac9b23ed385797f70a83bb8bafe9c572ab2b" +dependencies = [ + "winapi", +] + +[[package]] +name = "crypto-common" +version = "0.1.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1bfb12502f3fc46cca1bb51ac28df9d618d813cdc3d2f25b9fe775a34af26bb3" +dependencies = [ + "generic-array", + "typenum", +] + +[[package]] +name = "curl-sys" +version = "0.4.78+curl-8.11.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8eec768341c5c7789611ae51cf6c459099f22e64a5d5d0ce4892434e33821eaf" +dependencies = [ + "cc", + "libc", + "libz-sys", + "openssl-sys", + "pkg-config", + "vcpkg", + "windows-sys 0.52.0", +] + +[[package]] +name = "cursor-icon" +version = "1.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "96a6ac251f4a2aca6b3f91340350eab87ae57c3f127ffeb585e92bd336717991" + +[[package]] +name = "darling" +version = "0.20.10" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"6f63b86c8a8826a49b8c21f08a2d07338eec8d900540f8630dc76284be802989" +dependencies = [ + "darling_core", + "darling_macro", +] + +[[package]] +name = "darling_core" +version = "0.20.10" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "95133861a8032aaea082871032f5815eb9e98cef03fa916ab4500513994df9e5" +dependencies = [ + "fnv", + "ident_case", + "proc-macro2", + "quote", + "strsim", + "syn", +] + +[[package]] +name = "darling_macro" +version = "0.20.10" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d336a2a514f6ccccaa3e09b02d41d35330c07ddf03a62165fcec10bb561c7806" +dependencies = [ + "darling_core", + "quote", + "syn", +] + +[[package]] +name = "dary_heap" +version = "0.3.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "04d2cd9c18b9f454ed67da600630b021a8a80bf33f8c95896ab33aaf1c26b728" + +[[package]] +name = "debugid" +version = "0.8.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "bef552e6f588e446098f6ba40d89ac146c8c7b64aade83c051ee00bb5d2bc18d" +dependencies = [ + "uuid", +] + +[[package]] +name = "diff" +version = "0.1.13" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "56254986775e3233ffa9c4d7d3faaf6d36a2c09d30b20687e9f88bc8bafc16c8" + +[[package]] +name = "digest" +version = "0.10.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9ed9a281f7bc9b7576e61468ba615a66a5c8cfdff42420a70aa82701a3b1e292" +dependencies = [ + "block-buffer", + "crypto-common", +] + +[[package]] +name = "directories" +version = "5.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9a49173b84e034382284f27f1af4dcbbd231ffa358c0fe316541a7337f376a35" +dependencies = [ + "dirs-sys 0.4.1", +] + +[[package]] +name = "directories-next" +version = "2.0.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "339ee130d97a610ea5a5872d2bbb130fdf68884ff09d3028b81bec8a1ac23bbc" +dependencies = [ + "cfg-if", + "dirs-sys-next", +] + +[[package]] +name = "dirs" +version = "4.0.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ca3aa72a6f96ea37bbc5aa912f6788242832f75369bdfdadcb0e38423f100059" +dependencies = [ + "dirs-sys 0.3.7", +] + +[[package]] +name = "dirs-sys" +version = "0.3.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1b1d1d91c932ef41c0f2663aa8b0ca0342d444d842c06914aa0a7e352d0bada6" +dependencies = [ + "libc", + "redox_users", + "winapi", +] + +[[package]] +name = "dirs-sys" +version = "0.4.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "520f05a5cbd335fae5a99ff7a6ab8627577660ee5cfd6a94a6a929b52ff0321c" +dependencies = [ + "libc", + "option-ext", + "redox_users", + "windows-sys 0.48.0", +] + +[[package]] +name = "dirs-sys-next" +version = "0.1.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4ebda144c4fe02d1f7ea1a7d9641b6fc6b580adcfa024ae48797ecdeb6825b4d" +dependencies = [ + "libc", + "redox_users", + "winapi", +] + +[[package]] +name = "displaydoc" +version = "0.2.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "97369cbbc041bc366949bc74d34658d6cda5621039731c6310521892a3a20ae0" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "dlib" +version = "0.5.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"330c60081dcc4c72131f8eb70510f1ac07223e5d4163db481a04a0befcffa412" +dependencies = [ + "libloading", +] + +[[package]] +name = "downcast-rs" +version = "1.2.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "75b325c5dbd37f80359721ad39aca5a29fb04c89279657cffdda8736d0c0b9d2" + +[[package]] +name = "either" +version = "1.13.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "60b1af1c220855b6ceac025d3f6ecdd2b7c4894bfe9cd9bda4fbb4bc7c0d4cf0" + +[[package]] +name = "embedded-io" +version = "0.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ef1a6892d9eef45c8fa6b9e0086428a2cca8491aca8f787c534a3d6d0bcb3ced" + +[[package]] +name = "embedded-io" +version = "0.6.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "edd0f118536f44f5ccd48bcb8b111bdc3de888b58c74639dfb034a357d0f206d" + +[[package]] +name = "encode_unicode" +version = "0.3.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a357d28ed41a50f9c765dbfe56cbc04a64e53e5fc58ba79fbc34c10ef3df831f" + +[[package]] +name = "encoding_rs" +version = "0.8.35" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "75030f3c4f45dafd7586dd6780965a8c7e8e285a5ecb86713e63a79c5b2766f3" +dependencies = [ + "cfg-if", +] + +[[package]] +name = "env_filter" +version = "0.1.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4f2c92ceda6ceec50f43169f9ee8424fe2db276791afde7b2cd8bc084cb376ab" +dependencies = [ + "log", + "regex", +] + +[[package]] +name = "env_logger" +version = "0.11.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e13fa619b91fb2381732789fc5de83b45675e882f66623b7d8cb4f643017018d" +dependencies = [ + "anstream", + "anstyle", + "env_filter", + "humantime", + "log", +] + +[[package]] +name = "equivalent" +version = "1.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5443807d6dff69373d433ab9ef5378ad8df50ca6298caf15de6e52e24aaf54d5" + +[[package]] +name = "errno" +version = "0.3.10" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "33d852cb9b869c2a9b3df2f71a3074817f01e1844f839a144f5fcef059a4eb5d" +dependencies = [ + "libc", + "windows-sys 0.59.0", +] + +[[package]] +name = "extism" +version = "1.9.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f03e2cac5668dead4088aa9da25c9985f1a1b72edd3e31b201d2c044647b56f2" +dependencies = [ + "anyhow", + "cbindgen", + "extism-convert", + "extism-manifest", + "glob", + "libc", + "serde", + "serde_json", + "sha2", + "toml", + "tracing", + "tracing-subscriber", + "ureq", + "url", + "uuid", + "wasi-common", + "wasmtime", + "wiggle", +] + +[[package]] +name = "extism-convert" +version = "1.9.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a33423cbb1226c483f47a9cad883ee804caf45d8b0d2e5c03cd33d9e43ca5561" +dependencies = [ + "anyhow", + "base64 0.22.1", + "bytemuck", + "extism-convert-macros", + "prost", + "rmp-serde", + "serde", + "serde_json", +] + +[[package]] +name = "extism-convert-macros" +version = "1.9.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8bb132f6e20aab7e8eb3715e26ff8893c611abba73242e00771d740a0fbcb384" +dependencies = [ + "manyhow", + "proc-macro-crate", + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "extism-manifest" +version = "1.9.1" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "6d9c8b50558af0a75ce08b8ef90a37fb018d99cc99a5f7365c33ba008afdbfb4" +dependencies = [ + "base64 0.22.1", + "serde", + "serde_json", +] + +[[package]] +name = "fallible-iterator" +version = "0.3.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2acce4a10f12dc2fb14a218589d4f1f62ef011b2d0cc4b3cb1bba8e94da14649" + +[[package]] +name = "fastrand" +version = "2.3.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "37909eebbb50d72f9059c3b6d82c0463f2ff062c9e95845c43a6c9c0355411be" + +[[package]] +name = "fd-lock" +version = "4.0.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7e5768da2206272c81ef0b5e951a41862938a6070da63bcea197899942d3b947" +dependencies = [ + "cfg-if", + "rustix", + "windows-sys 0.52.0", +] + +[[package]] +name = "flate2" +version = "1.0.35" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c936bfdafb507ebbf50b8074c54fa31c5be9a1e7e5f467dd659697041407d07c" +dependencies = [ + "crc32fast", + "miniz_oxide", +] + +[[package]] +name = "fnv" +version = "1.0.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3f9eec918d3f24069decb9af1554cad7c880e2da24a9afd88aca000531ab82c1" + +[[package]] +name = "foldhash" +version = "0.1.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f81ec6369c545a7d40e4589b5597581fa1c441fe1cce96dd1de43159910a36a2" + +[[package]] +name = "foreign-types" +version = "0.3.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f6f339eb8adc052cd2ca78910fda869aefa38d22d5cb648e6485e4d3fc06f3b1" +dependencies = [ + "foreign-types-shared", +] + +[[package]] +name = "foreign-types-shared" +version = "0.1.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "00b0228411908ca8685dba7fc2cdd70ec9990a6e753e89b6ac91a84c40fbaf4b" + +[[package]] +name = "form_urlencoded" +version = "1.2.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e13624c2627564efccf4934284bdd98cbaa14e79b0b5a141218e507b3a823456" +dependencies = [ + "percent-encoding", +] + +[[package]] +name = "fs-set-times" +version = "0.20.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5e2e6123af26f0f2c51cc66869137080199406754903cc926a7690401ce09cb4" +dependencies = [ + "io-lifetimes", + "rustix", + "windows-sys 0.59.0", +] + +[[package]] +name = "futures" +version = "0.3.31" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "65bc07b1a8bc7c85c5f2e110c476c7389b4554ba72af57d8445ea63a576b0876" +dependencies = [ + "futures-channel", + "futures-core", + "futures-executor", + "futures-io", + "futures-sink", + "futures-task", + "futures-util", +] + +[[package]] +name = "futures-channel" +version = "0.3.31" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2dff15bf788c671c1934e366d07e30c1814a8ef514e1af724a602e8a2fbe1b10" +dependencies = [ + "futures-core", + "futures-sink", +] + +[[package]] +name = "futures-core" +version = "0.3.31" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "05f29059c0c2090612e8d742178b0580d2dc940c837851ad723096f87af6663e" + +[[package]] +name = "futures-executor" +version = "0.3.31" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1e28d1d997f585e54aebc3f97d39e72338912123a67330d723fdbb564d646c9f" +dependencies = [ + 
"futures-core", + "futures-task", + "futures-util", +] + +[[package]] +name = "futures-io" +version = "0.3.31" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9e5c1b78ca4aae1ac06c48a526a655760685149f0d465d21f37abfe57ce075c6" + +[[package]] +name = "futures-macro" +version = "0.3.31" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "162ee34ebcb7c64a8abebc059ce0fee27c2262618d7b60ed8faf72fef13c3650" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "futures-sink" +version = "0.3.31" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e575fab7d1e0dcb8d0c7bcf9a63ee213816ab51902e6d244a95819acacf1d4f7" + +[[package]] +name = "futures-task" +version = "0.3.31" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f90f7dce0722e95104fcb095585910c0977252f286e354b5e3bd38902cd99988" + +[[package]] +name = "futures-util" +version = "0.3.31" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9fa08315bb612088cc391249efdc3bc77536f16c91f6cf495e6fbe85b20a4a81" +dependencies = [ + "futures-channel", + "futures-core", + "futures-io", + "futures-macro", + "futures-sink", + "futures-task", + "memchr", + "pin-project-lite", + "pin-utils", + "slab", +] + +[[package]] +name = "fuzzydate" +version = "0.2.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7265f35cc1f40c655aad829323a1daef5f21fd38904f6ed9bd5ec3df3cbd851c" +dependencies = [ + "chrono", + "lazy_static", + "thiserror", +] + +[[package]] +name = "fxhash" +version = "0.2.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c31b6d751ae2c7f11320402d34e41349dd1016f8d5d45e48c4312bc8625af50c" +dependencies = [ + "byteorder", +] + +[[package]] +name = "fxprof-processed-profile" +version = "0.6.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "27d12c0aed7f1e24276a241aadc4cb8ea9f83000f34bc062b7cc2d51e3b0fabd" +dependencies = [ + "bitflags", + "debugid", + "fxhash", + "serde", + "serde_json", +] + +[[package]] +name = "generic-array" +version = "0.14.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "85649ca51fd72272d7821adaf274ad91c288277713d9c18820d8499a7ff69e9a" +dependencies = [ + "typenum", + "version_check", +] + +[[package]] +name = "gethostname" +version = "0.4.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0176e0459c2e4a1fe232f984bca6890e681076abb9934f6cea7c326f3fc47818" +dependencies = [ + "libc", + "windows-targets 0.48.5", +] + +[[package]] +name = "getrandom" +version = "0.2.15" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c4567c8db10ae91089c99af84c68c38da3ec2f087c3f82960bcdbf3656b6f4d7" +dependencies = [ + "cfg-if", + "libc", + "wasi", +] + +[[package]] +name = "gimli" +version = "0.31.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "07e28edb80900c19c28f1072f2e8aeca7fa06b23cd4169cefe1af5aa3260783f" +dependencies = [ + "fallible-iterator", + "indexmap", + "stable_deref_trait", +] + +[[package]] +name = "glob" +version = "0.3.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d2fabcfbdc87f4758337ca535fb41a6d701b65693ce38287d856d1674551ec9b" + +[[package]] +name = "globset" +version = "0.4.15" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "15f1ce686646e7f1e19bf7d5533fe443a45dbfb990e00629110797578b42fb19" 
+dependencies = [ + "aho-corasick", + "bstr", + "log", + "regex-automata 0.4.9", + "regex-syntax 0.8.5", +] + +[[package]] +name = "h2" +version = "0.4.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ccae279728d634d083c00f6099cb58f01cc99c145b84b8be2f6c74618d79922e" +dependencies = [ + "atomic-waker", + "bytes", + "fnv", + "futures-core", + "futures-sink", + "http", + "indexmap", + "slab", + "tokio", + "tokio-util", + "tracing", +] + +[[package]] +name = "hashbrown" +version = "0.14.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e5274423e17b7c9fc20b6e7e208532f9b19825d82dfd615708b70edd83df41f1" +dependencies = [ + "ahash", + "allocator-api2", + "serde", +] + +[[package]] +name = "hashbrown" +version = "0.15.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "bf151400ff0baff5465007dd2f3e717f3fe502074ca563069ce3a6629d07b289" +dependencies = [ + "allocator-api2", + "equivalent", + "foldhash", +] + +[[package]] +name = "heck" +version = "0.4.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "95505c38b4572b2d910cecb0281560f54b440a19336cbbcb27bf6ce6adc6f5a8" + +[[package]] +name = "heck" +version = "0.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2304e00983f87ffb38b55b444b5e3b60a884b5d30c0fca7d82fe33449bbe55ea" + +[[package]] +name = "hermit-abi" +version = "0.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "fbf6a919d6cf397374f7dfeeea91d974c7c0a7221d0d0f4f20d859d329e53fcc" + +[[package]] +name = "http" +version = "1.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f16ca2af56261c99fba8bac40a10251ce8188205a4c448fbb745a2e4daa76fea" +dependencies = [ + "bytes", + "fnv", + "itoa", +] + +[[package]] +name = "http-body" +version = "1.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1efedce1fb8e6913f23e0c92de8e62cd5b772a67e7b3946df930a62566c93184" +dependencies = [ + "bytes", + "http", +] + +[[package]] +name = "http-body-util" +version = "0.1.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "793429d76616a256bcb62c2a2ec2bed781c8307e797e2598c50010f2bee2544f" +dependencies = [ + "bytes", + "futures-util", + "http", + "http-body", + "pin-project-lite", +] + +[[package]] +name = "httparse" +version = "1.9.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7d71d3574edd2771538b901e6549113b4006ece66150fb69c0fb6d9a2adae946" + +[[package]] +name = "humantime" +version = "2.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9a3a5bfb195931eeb336b2a7b4d761daec841b97f947d34394601737a7bba5e4" + +[[package]] +name = "hyper" +version = "1.5.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "97818827ef4f364230e16705d4706e2897df2bb60617d6ca15d598025a3c481f" +dependencies = [ + "bytes", + "futures-channel", + "futures-util", + "h2", + "http", + "http-body", + "httparse", + "itoa", + "pin-project-lite", + "smallvec", + "tokio", + "want", +] + +[[package]] +name = "hyper-rustls" +version = "0.27.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "08afdbb5c31130e3034af566421053ab03787c640246a446327f550d11bcb333" +dependencies = [ + "futures-util", + "http", + "hyper", + "hyper-util", + "rustls", + "rustls-pki-types", + "tokio", + "tokio-rustls", + "tower-service", +] + +[[package]] +name = "hyper-tls" 
+version = "0.6.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "70206fc6890eaca9fde8a0bf71caa2ddfc9fe045ac9e5c70df101a7dbde866e0" +dependencies = [ + "bytes", + "http-body-util", + "hyper", + "hyper-util", + "native-tls", + "tokio", + "tokio-native-tls", + "tower-service", +] + +[[package]] +name = "hyper-util" +version = "0.1.10" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "df2dcfbe0677734ab2f3ffa7fa7bfd4706bfdc1ef393f2ee30184aed67e631b4" +dependencies = [ + "bytes", + "futures-channel", + "futures-util", + "http", + "http-body", + "hyper", + "pin-project-lite", + "socket2", + "tokio", + "tower-service", + "tracing", +] + +[[package]] +name = "iana-time-zone" +version = "0.1.61" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "235e081f3925a06703c2d0117ea8b91f042756fd6e7a6e5d901e8ca1a996b220" +dependencies = [ + "android_system_properties", + "core-foundation-sys", + "iana-time-zone-haiku", + "js-sys", + "wasm-bindgen", + "windows-core", +] + +[[package]] +name = "iana-time-zone-haiku" +version = "0.1.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f31827a206f56af32e590ba56d5d2d085f558508192593743f16b2306495269f" +dependencies = [ + "cc", +] + +[[package]] +name = "icu_collections" +version = "1.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "db2fa452206ebee18c4b5c2274dbf1de17008e874b4dc4f0aea9d01ca79e4526" +dependencies = [ + "displaydoc", + "yoke", + "zerofrom", + "zerovec", +] + +[[package]] +name = "icu_locid" +version = "1.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "13acbb8371917fc971be86fc8057c41a64b521c184808a698c02acc242dbf637" +dependencies = [ + "displaydoc", + "litemap", + "tinystr", + "writeable", + "zerovec", +] + +[[package]] +name = "icu_locid_transform" +version = "1.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "01d11ac35de8e40fdeda00d9e1e9d92525f3f9d887cdd7aa81d727596788b54e" +dependencies = [ + "displaydoc", + "icu_locid", + "icu_locid_transform_data", + "icu_provider", + "tinystr", + "zerovec", +] + +[[package]] +name = "icu_locid_transform_data" +version = "1.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "fdc8ff3388f852bede6b579ad4e978ab004f139284d7b28715f773507b946f6e" + +[[package]] +name = "icu_normalizer" +version = "1.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "19ce3e0da2ec68599d193c93d088142efd7f9c5d6fc9b803774855747dc6a84f" +dependencies = [ + "displaydoc", + "icu_collections", + "icu_normalizer_data", + "icu_properties", + "icu_provider", + "smallvec", + "utf16_iter", + "utf8_iter", + "write16", + "zerovec", +] + +[[package]] +name = "icu_normalizer_data" +version = "1.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f8cafbf7aa791e9b22bec55a167906f9e1215fd475cd22adfcf660e03e989516" + +[[package]] +name = "icu_properties" +version = "1.5.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "93d6020766cfc6302c15dbbc9c8778c37e62c14427cb7f6e601d849e092aeef5" +dependencies = [ + "displaydoc", + "icu_collections", + "icu_locid_transform", + "icu_properties_data", + "icu_provider", + "tinystr", + "zerovec", +] + +[[package]] +name = "icu_properties_data" +version = "1.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"67a8effbc3dd3e4ba1afa8ad918d5684b8868b3b26500753effea8d2eed19569" + +[[package]] +name = "icu_provider" +version = "1.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6ed421c8a8ef78d3e2dbc98a973be2f3770cb42b606e3ab18d6237c4dfde68d9" +dependencies = [ + "displaydoc", + "icu_locid", + "icu_provider_macros", + "stable_deref_trait", + "tinystr", + "writeable", + "yoke", + "zerofrom", + "zerovec", +] + +[[package]] +name = "icu_provider_macros" +version = "1.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1ec89e9337638ecdc08744df490b221a7399bf8d164eb52a665454e60e075ad6" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "id-arena" +version = "2.2.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "25a2bc672d1148e28034f176e01fffebb08b35768468cc954630da77a1449005" + +[[package]] +name = "ident_case" +version = "1.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b9e0384b61958566e926dc50660321d12159025e767c18e043daf26b70104c39" + +[[package]] +name = "idna" +version = "1.0.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "686f825264d630750a544639377bae737628043f20d38bbc029e8f29ea968a7e" +dependencies = [ + "idna_adapter", + "smallvec", + "utf8_iter", +] + +[[package]] +name = "idna_adapter" +version = "1.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "daca1df1c957320b2cf139ac61e7bd64fed304c5040df000a745aa1de3b4ef71" +dependencies = [ + "icu_normalizer", + "icu_properties", +] + +[[package]] +name = "indexmap" +version = "2.7.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "62f822373a4fe84d4bb149bf54e584a7f4abec90e072ed49cda0edea5b95471f" +dependencies = [ + "equivalent", + "hashbrown 0.15.2", + "serde", +] + +[[package]] +name = "indicatif" +version = "0.17.9" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "cbf675b85ed934d3c67b5c5469701eec7db22689d0a2139d856e0925fa28b281" +dependencies = [ + "console", + "number_prefix", + "portable-atomic", + "tokio", + "unicode-width 0.2.0", + "web-time", +] + +[[package]] +name = "indoc" +version = "2.0.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b248f5224d1d606005e02c97f5aa4e88eeb230488bcc03bc9ca4d7991399f2b5" + +[[package]] +name = "insta" +version = "1.41.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7e9ffc4d4892617c50a928c52b2961cb5174b6fc6ebf252b2fac9d21955c48b8" +dependencies = [ + "console", + "globset", + "lazy_static", + "linked-hash-map", + "regex", + "similar", + "walkdir", +] + +[[package]] +name = "instability" +version = "0.3.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b829f37dead9dc39df40c2d3376c179fdfd2ac771f53f55d3c30dc096a3c0c6e" +dependencies = [ + "darling", + "indoc", + "pretty_assertions", + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "io-extras" +version = "0.18.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2285ddfe3054097ef4b2fe909ef8c3bcd1ea52a8f0d274416caebeef39f04a65" +dependencies = [ + "io-lifetimes", + "windows-sys 0.59.0", +] + +[[package]] +name = "io-lifetimes" +version = "2.0.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "06432fb54d3be7964ecd3649233cddf80db2832f47fec34c01f65b3d9d774983" + +[[package]] +name = "ipnet" +version = "2.10.1" 
+source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ddc24109865250148c2e0f3d25d4f0f479571723792d3802153c60922a4fb708" + +[[package]] +name = "is-docker" +version = "0.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "928bae27f42bc99b60d9ac7334e3a21d10ad8f1835a4e12ec3ec0464765ed1b3" +dependencies = [ + "once_cell", +] + +[[package]] +name = "is-wsl" +version = "0.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "173609498df190136aa7dea1a91db051746d339e18476eed5ca40521f02d7aa5" +dependencies = [ + "is-docker", + "once_cell", +] + +[[package]] +name = "is_terminal_polyfill" +version = "1.70.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7943c866cc5cd64cbc25b2e01621d07fa8eb2a1a23160ee81ce38704e97b8ecf" + +[[package]] +name = "itertools" +version = "0.12.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ba291022dbbd398a455acf126c1e341954079855bc60dfdda641363bd6922569" +dependencies = [ + "either", +] + +[[package]] +name = "itertools" +version = "0.13.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "413ee7dfc52ee1a4949ceeb7dbc8a33f2d6c088194d9f922fb8318faf1f01186" +dependencies = [ + "either", +] + +[[package]] +name = "itoa" +version = "1.0.14" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d75a2a4b1b190afb6f5425f10f6a8f959d2ea0b9c2b1d79553551850539e4674" + +[[package]] +name = "ittapi" +version = "0.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6b996fe614c41395cdaedf3cf408a9534851090959d90d54a535f675550b64b1" +dependencies = [ + "anyhow", + "ittapi-sys", + "log", +] + +[[package]] +name = "ittapi-sys" +version = "0.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "52f5385394064fa2c886205dba02598013ce83d3e92d33dbdc0c52fe0e7bf4fc" +dependencies = [ + "cc", +] + +[[package]] +name = "jobserver" +version = "0.1.32" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "48d1dbcbbeb6a7fec7e059840aa538bd62aaccf972c7346c4d9d2059312853d0" +dependencies = [ + "libc", +] + +[[package]] +name = "js-sys" +version = "0.3.76" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6717b6b5b077764fb5966237269cb3c64edddde4b14ce42647430a78ced9e7b7" +dependencies = [ + "once_cell", + "wasm-bindgen", +] + +[[package]] +name = "lazy-bytes-cast" +version = "5.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "10257499f089cd156ad82d0a9cd57d9501fa2c989068992a97eb3c27836f206b" + +[[package]] +name = "lazy_static" +version = "1.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "bbd2bcb4c963f2ddae06a2efc7e9f3591312473c50c6685e1f298068316e66fe" + +[[package]] +name = "leb128" +version = "0.2.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "884e2677b40cc8c339eaefcb701c32ef1fd2493d71118dc0ca4b6a736c93bd67" + +[[package]] +name = "libc" +version = "0.2.168" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5aaeb2981e0606ca11d79718f8bb01164f1d6ed75080182d3abf017e6d244b6d" + +[[package]] +name = "libflate" +version = "2.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "45d9dfdc14ea4ef0900c1cddbc8dcd553fbaacd8a4a282cf4018ae9dd04fb21e" +dependencies = [ + "adler32", + "core2", + "crc32fast", + "dary_heap", + "libflate_lz77", +] 
+ +[[package]] +name = "libflate_lz77" +version = "2.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e6e0d73b369f386f1c44abd9c570d5318f55ccde816ff4b562fa452e5182863d" +dependencies = [ + "core2", + "hashbrown 0.14.5", + "rle-decode-fast", +] + +[[package]] +name = "libloading" +version = "0.8.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "fc2f4eb4bc735547cfed7c0a4922cbd04a4655978c09b54f1f7b228750664c34" +dependencies = [ + "cfg-if", + "windows-targets 0.52.6", +] + +[[package]] +name = "libm" +version = "0.2.11" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8355be11b20d696c8f18f6cc018c4e372165b1fa8126cef092399c9951984ffa" + +[[package]] +name = "libredox" +version = "0.1.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c0ff37bd590ca25063e35af745c343cb7a0271906fb7b37e4813e8f79f00268d" +dependencies = [ + "bitflags", + "libc", +] + +[[package]] +name = "libz-sys" +version = "1.1.20" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d2d16453e800a8cf6dd2fc3eb4bc99b786a9b90c663b8559a5b1a041bf89e472" +dependencies = [ + "cc", + "libc", + "pkg-config", + "vcpkg", +] + +[[package]] +name = "linked-hash-map" +version = "0.5.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0717cef1bc8b636c6e1c1bbdefc09e6322da8a9321966e8928ef80d20f7f770f" + +[[package]] +name = "linux-raw-sys" +version = "0.4.14" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "78b3ae25bc7c8c38cec158d1f2757ee79e9b3740fbc7ccf0e59e4b08d793fa89" + +[[package]] +name = "litemap" +version = "0.7.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4ee93343901ab17bd981295f2cf0026d4ad018c7c31ba84549a4ddbb47a45104" + +[[package]] +name = "lock_api" +version = "0.4.12" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "07af8b9cdd281b7915f413fa73f29ebd5d55d0d3f0155584dade1ff18cea1b17" +dependencies = [ + "autocfg", + "scopeguard", +] + +[[package]] +name = "log" +version = "0.4.22" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a7a70ba024b9dc04c27ea2f0c0548feb474ec5c54bba33a7f72f873a39d07b24" + +[[package]] +name = "lru" +version = "0.12.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "234cf4f4a04dc1f57e24b96cc0cd600cf2af460d4161ac5ecdd0af8e1f3b2a38" +dependencies = [ + "hashbrown 0.15.2", +] + +[[package]] +name = "lz4-sys" +version = "1.11.1+lz4-1.10.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6bd8c0d6c6ed0cd30b3652886bb8711dc4bb01d637a68105a3d5158039b418e6" +dependencies = [ + "cc", + "libc", +] + +[[package]] +name = "mach2" +version = "0.4.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "19b955cdeb2a02b9117f121ce63aa52d08ade45de53e48fe6a38b39c10f6f709" +dependencies = [ + "libc", +] + +[[package]] +name = "malloc_buf" +version = "0.0.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "62bb907fe88d54d8d9ce32a3cceab4218ed2f6b7d35617cafe9adf84e43919cb" +dependencies = [ + "libc", +] + +[[package]] +name = "manyhow" +version = "0.11.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b33efb3ca6d3b07393750d4030418d594ab1139cee518f0dc88db70fec873587" +dependencies = [ + "manyhow-macros", + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "manyhow-macros" 
+version = "0.11.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "46fce34d199b78b6e6073abf984c9cf5fd3e9330145a93ee0738a7443e371495" +dependencies = [ + "proc-macro-utils", + "proc-macro2", + "quote", +] + +[[package]] +name = "matchers" +version = "0.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8263075bb86c5a1b1427b5ae862e8889656f126e9f77c484496e8b47cf5c5558" +dependencies = [ + "regex-automata 0.1.10", +] + +[[package]] +name = "maybe-owned" +version = "0.3.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4facc753ae494aeb6e3c22f839b158aebd4f9270f55cd3c79906c45476c47ab4" + +[[package]] +name = "memchr" +version = "2.7.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "78ca9ab1a0babb1e7d5695e3530886289c18cf2f87ec19a575a0abdce112e3a3" + +[[package]] +name = "memfd" +version = "0.6.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b2cffa4ad52c6f791f4f8b15f0c05f9824b2ced1160e88cc393d64fff9a8ac64" +dependencies = [ + "rustix", +] + +[[package]] +name = "memmap2" +version = "0.9.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "fd3f7eed9d3848f8b98834af67102b720745c4ec028fcd0aa0239277e7de374f" +dependencies = [ + "libc", +] + +[[package]] +name = "mime" +version = "0.3.17" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6877bb514081ee2a7ff5ef9de3281f14a4dd4bceac4c09388074a6b5df8a139a" + +[[package]] +name = "miniz_oxide" +version = "0.8.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e2d80299ef12ff69b16a84bb182e3b9df68b5a91574d3d4fa6e41b65deec4df1" +dependencies = [ + "adler2", +] + +[[package]] +name = "mio" +version = "1.0.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2886843bf800fba2e3377cff24abf6379b4c4d5c6681eaf9ea5b0d15090450bd" +dependencies = [ + "libc", + "log", + "wasi", + "windows-sys 0.52.0", +] + +[[package]] +name = "native-tls" +version = "0.2.12" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a8614eb2c83d59d1c8cc974dd3f920198647674a0a035e1af1fa58707e317466" +dependencies = [ + "libc", + "log", + "openssl", + "openssl-probe", + "openssl-sys", + "schannel", + "security-framework", + "security-framework-sys", + "tempfile", +] + +[[package]] +name = "nom" +version = "8.0.0-beta.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "842c447b3ca1221b1c26a101f24c092320f8014f40729356fbcba967afb0c1f2" +dependencies = [ + "memchr", +] + +[[package]] +name = "nu-ansi-term" +version = "0.46.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "77a8165726e8236064dbb45459242600304b42a5ea24ee2948e18e023bf7ba84" +dependencies = [ + "overload", + "winapi", +] + +[[package]] +name = "num-bigint" +version = "0.4.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a5e44f723f1133c9deac646763579fdb3ac745e418f2a7af9cd0c431da1f20b9" +dependencies = [ + "num-integer", + "num-traits", + "serde", +] + +[[package]] +name = "num-integer" +version = "0.1.46" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7969661fd2958a5cb096e56c8e1ad0444ac2bbcd0061bd28660485a44879858f" +dependencies = [ + "num-traits", +] + +[[package]] +name = "num-traits" +version = "0.2.19" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"071dfc062690e90b734c0b2273ce72ad0ffa95f0c74596bc250dcfd960262841" +dependencies = [ + "autocfg", +] + +[[package]] +name = "num_enum" +version = "0.7.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4e613fc340b2220f734a8595782c551f1250e969d87d3be1ae0579e8d4065179" +dependencies = [ + "num_enum_derive", +] + +[[package]] +name = "num_enum_derive" +version = "0.7.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "af1844ef2428cc3e1cb900be36181049ef3d3193c63e43026cfe202983b27a56" +dependencies = [ + "proc-macro-crate", + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "number_prefix" +version = "0.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "830b246a0e5f20af87141b25c173cd1b609bd7779a4617d6ec582abaf90870f3" + +[[package]] +name = "objc" +version = "0.2.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "915b1b472bc21c53464d6c8461c9d3af805ba1ef837e1cac254428f4a77177b1" +dependencies = [ + "malloc_buf", +] + +[[package]] +name = "objc-foundation" +version = "0.1.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1add1b659e36c9607c7aab864a76c7a4c2760cd0cd2e120f3fb8b952c7e22bf9" +dependencies = [ + "block", + "objc", + "objc_id", +] + +[[package]] +name = "objc_id" +version = "0.1.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c92d4ddb4bd7b50d730c215ff871754d0da6b2178849f8a2a2ab69712d0c073b" +dependencies = [ + "objc", +] + +[[package]] +name = "object" +version = "0.36.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "aedf0a2d09c573ed1d8d85b30c119153926a2b36dce0ab28322c09a117a4683e" +dependencies = [ + "crc32fast", + "hashbrown 0.15.2", + "indexmap", + "memchr", +] + +[[package]] +name = "once_cell" +version = "1.20.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1261fe7e33c73b354eab43b1273a57c8f967d0391e80353e51f764ac02cf6775" + +[[package]] +name = "open" +version = "5.3.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3ecd52f0b8d15c40ce4820aa251ed5de032e5d91fab27f7db2f40d42a8bdf69c" +dependencies = [ + "is-wsl", + "libc", + "pathdiff", +] + +[[package]] +name = "openssl" +version = "0.10.68" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6174bc48f102d208783c2c84bf931bb75927a617866870de8a4ea85597f871f5" +dependencies = [ + "bitflags", + "cfg-if", + "foreign-types", + "libc", + "once_cell", + "openssl-macros", + "openssl-sys", +] + +[[package]] +name = "openssl-macros" +version = "0.1.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a948666b637a0f465e8564c73e89d4dde00d72d4d473cc972f390fc3dcee7d9c" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "openssl-probe" +version = "0.1.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ff011a302c396a5197692431fc1948019154afc178baf7d8e37367442a4601cf" + +[[package]] +name = "openssl-src" +version = "300.4.1+3.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "faa4eac4138c62414b5622d1b31c5c304f34b406b013c079c2bbc652fdd6678c" +dependencies = [ + "cc", +] + +[[package]] +name = "openssl-sys" +version = "0.9.104" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "45abf306cbf99debc8195b66b7346498d7b10c210de50418b5ccd7ceba08c741" +dependencies = [ + "cc", + "libc", + 
"openssl-src", + "pkg-config", + "vcpkg", +] + +[[package]] +name = "option-ext" +version = "0.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "04744f49eae99ab78e0d5c0b603ab218f515ea8cfe5a456d7629ad883a3b6e7d" + +[[package]] +name = "overload" +version = "0.1.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b15813163c1d831bf4a13c3610c05c0d03b39feb07f7e09fa234dac9b15aaf39" + +[[package]] +name = "parking_lot" +version = "0.12.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f1bf18183cf54e8d6059647fc3063646a1801cf30896933ec2311622cc4b9a27" +dependencies = [ + "lock_api", + "parking_lot_core", +] + +[[package]] +name = "parking_lot_core" +version = "0.9.10" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1e401f977ab385c9e4e3ab30627d6f26d00e2c73eef317493c4ec6d468726cf8" +dependencies = [ + "cfg-if", + "libc", + "redox_syscall", + "smallvec", + "windows-targets 0.52.6", +] + +[[package]] +name = "paste" +version = "1.0.15" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "57c0d7b74b563b49d38dae00a0c37d4d6de9b432382b2892f0574ddcae73fd0a" + +[[package]] +name = "pathdiff" +version = "0.2.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "df94ce210e5bc13cb6651479fa48d14f601d9858cfe0467f43ae157023b938d3" + +[[package]] +name = "percent-encoding" +version = "2.3.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e3148f5046208a5d56bcfc03053e3ca6334e51da8dfb19b6cdc8b306fae3283e" + +[[package]] +name = "pin-project-lite" +version = "0.2.15" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "915a1e146535de9163f3987b8944ed8cf49a18bb0056bcebcdcece385cece4ff" + +[[package]] +name = "pin-utils" +version = "0.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8b870d8c151b6f2fb93e84a13146138f05d02ed11c7e7c54f8826aaaf7c9f184" + +[[package]] +name = "pkg-config" +version = "0.3.31" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "953ec861398dccce10c670dfeaf3ec4911ca479e9c02154b3a215178c5f566f2" + +[[package]] +name = "polling" +version = "3.7.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a604568c3202727d1507653cb121dbd627a58684eb09a820fd746bee38b4442f" +dependencies = [ + "cfg-if", + "concurrent-queue", + "hermit-abi", + "pin-project-lite", + "rustix", + "tracing", + "windows-sys 0.59.0", +] + +[[package]] +name = "portable-atomic" +version = "1.10.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "280dc24453071f1b63954171985a0b0d30058d287960968b9b2aca264c8d4ee6" + +[[package]] +name = "postcard" +version = "1.1.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "170a2601f67cc9dba8edd8c4870b15f71a6a2dc196daec8c83f72b59dff628a8" +dependencies = [ + "cobs", + "embedded-io 0.4.0", + "embedded-io 0.6.1", + "serde", +] + +[[package]] +name = "ppv-lite86" +version = "0.2.20" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "77957b295656769bb8ad2b6a6b09d897d94f05c41b069aede1fcdaa675eaea04" +dependencies = [ + "zerocopy", +] + +[[package]] +name = "pretty_assertions" +version = "1.4.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3ae130e2f271fbc2ac3a40fb1d07180839cdbbe443c7a27e1e3c13c5cac0116d" +dependencies = [ + "diff", + "yansi", +] + +[[package]] 
+name = "proc-macro-crate" +version = "3.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8ecf48c7ca261d60b74ab1a7b20da18bede46776b2e55535cb958eb595c5fa7b" +dependencies = [ + "toml_edit", +] + +[[package]] +name = "proc-macro-utils" +version = "0.10.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "eeaf08a13de400bc215877b5bdc088f241b12eb42f0a548d3390dc1c56bb7071" +dependencies = [ + "proc-macro2", + "quote", + "smallvec", +] + +[[package]] +name = "proc-macro2" +version = "1.0.92" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "37d3544b3f2748c54e147655edb5025752e2303145b5aefb3c3ea2c78b973bb0" +dependencies = [ + "unicode-ident", +] + +[[package]] +name = "prost" +version = "0.13.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2c0fef6c4230e4ccf618a35c59d7ede15dea37de8427500f50aff708806e42ec" +dependencies = [ + "bytes", + "prost-derive", +] + +[[package]] +name = "prost-derive" +version = "0.13.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "157c5a9d7ea5c2ed2d9fb8f495b64759f7816c7eaea54ba3978f0d63000162e3" +dependencies = [ + "anyhow", + "itertools 0.13.0", + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "protobuf" +version = "3.7.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a3a7c64d9bf75b1b8d981124c14c179074e8caa7dfe7b6a12e6222ddcd0c8f72" +dependencies = [ + "once_cell", + "protobuf-support", + "thiserror", +] + +[[package]] +name = "protobuf-support" +version = "3.7.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b088fd20b938a875ea00843b6faf48579462630015c3788d397ad6a786663252" +dependencies = [ + "thiserror", +] + +[[package]] +name = "psm" +version = "0.1.24" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "200b9ff220857e53e184257720a14553b2f4aa02577d2ed9842d45d4b9654810" +dependencies = [ + "cc", +] + +[[package]] +name = "pulley-interpreter" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "df33e7f8a43ccc7f93b330fef4baf271764674926f3f4d40f4a196d54de8af26" +dependencies = [ + "cranelift-bitset", + "log", + "sptr", +] + +[[package]] +name = "quad-rand" +version = "0.2.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5a651516ddc9168ebd67b24afd085a718be02f8858fe406591b013d101ce2f40" + +[[package]] +name = "quick-xml" +version = "0.36.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f7649a7b4df05aed9ea7ec6f628c67c9953a43869b8bc50929569b2999d443fe" +dependencies = [ + "memchr", +] + +[[package]] +name = "quote" +version = "1.0.37" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b5b9d34b8991d19d98081b46eacdd8eb58c6f2b201139f7c5f643cc155a633af" +dependencies = [ + "proc-macro2", +] + +[[package]] +name = "rand" +version = "0.8.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "34af8d1a0e25924bc5b7c43c079c942339d8f0a8b57c39049bef581b46327404" +dependencies = [ + "libc", + "rand_chacha", + "rand_core", +] + +[[package]] +name = "rand_chacha" +version = "0.3.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e6c10a63a0fa32252be49d21e7709d4d4baf8d231c2dbce1eaa8141b9b127d88" +dependencies = [ + "ppv-lite86", + "rand_core", +] + +[[package]] +name = "rand_core" +version = "0.6.4" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "ec0be4795e2f6a28069bec0b5ff3e2ac9bafc99e6a9a7dc3547996c5c816922c" +dependencies = [ + "getrandom", +] + +[[package]] +name = "ratatui" +version = "0.29.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "eabd94c2f37801c20583fc49dd5cd6b0ba68c716787c2dd6ed18571e1e63117b" +dependencies = [ + "bitflags", + "cassowary", + "compact_str", + "crossterm", + "indoc", + "instability", + "itertools 0.13.0", + "lru", + "paste", + "serde", + "strum", + "unicode-segmentation", + "unicode-truncate", + "unicode-width 0.2.0", +] + +[[package]] +name = "rayon" +version = "1.10.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b418a60154510ca1a002a752ca9714984e21e4241e804d32555251faf8b78ffa" +dependencies = [ + "either", + "rayon-core", +] + +[[package]] +name = "rayon-core" +version = "1.12.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1465873a3dfdaa8ae7cb14b4383657caab0b3e8a0aa9ae8e04b044854c8dfce2" +dependencies = [ + "crossbeam-deque", + "crossbeam-utils", +] + +[[package]] +name = "rdkafka" +version = "0.37.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "14b52c81ac3cac39c9639b95c20452076e74b8d9a71bc6fc4d83407af2ea6fff" +dependencies = [ + "futures-channel", + "futures-util", + "libc", + "log", + "rdkafka-sys", + "serde", + "serde_derive", + "serde_json", + "slab", + "tokio", +] + +[[package]] +name = "rdkafka-sys" +version = "4.8.0+2.3.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ced38182dc436b3d9df0c77976f37a67134df26b050df1f0006688e46fc4c8be" +dependencies = [ + "cmake", + "curl-sys", + "libc", + "libz-sys", + "lz4-sys", + "num_enum", + "openssl-sys", + "pkg-config", + "zstd-sys", +] + +[[package]] +name = "redox_syscall" +version = "0.5.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9b6dfecf2c74bce2466cabf93f6664d6998a69eb21e39f4207930065b27b771f" +dependencies = [ + "bitflags", +] + +[[package]] +name = "redox_users" +version = "0.4.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ba009ff324d1fc1b900bd1fdb31564febe58a8ccc8a6fdbb93b543d33b13ca43" +dependencies = [ + "getrandom", + "libredox", + "thiserror", +] + +[[package]] +name = "regalloc2" +version = "0.10.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "12908dbeb234370af84d0579b9f68258a0f67e201412dd9a2814e6f45b2fc0f0" +dependencies = [ + "hashbrown 0.14.5", + "log", + "rustc-hash", + "slice-group-by", + "smallvec", +] + +[[package]] +name = "regex" +version = "1.11.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b544ef1b4eac5dc2db33ea63606ae9ffcfac26c1416a2806ae0bf5f56b201191" +dependencies = [ + "aho-corasick", + "memchr", + "regex-automata 0.4.9", + "regex-syntax 0.8.5", +] + +[[package]] +name = "regex-automata" +version = "0.1.10" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6c230d73fb8d8c1b9c0b3135c5142a8acee3a0558fb8db5cf1cb65f8d7862132" +dependencies = [ + "regex-syntax 0.6.29", +] + +[[package]] +name = "regex-automata" +version = "0.4.9" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "809e8dc61f6de73b46c85f4c96486310fe304c434cfa43669d7b40f711150908" +dependencies = [ + "aho-corasick", + "memchr", + "regex-syntax 0.8.5", +] + +[[package]] +name = "regex-lite" +version = "0.1.6" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "53a49587ad06b26609c52e423de037e7f57f20d53535d66e08c695f347df952a" + +[[package]] +name = "regex-syntax" +version = "0.6.29" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f162c6dd7b008981e4d40210aca20b4bd0f9b60ca9271061b07f78537722f2e1" + +[[package]] +name = "regex-syntax" +version = "0.8.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2b15c43186be67a4fd63bee50d0303afffcef381492ebe2c5d87f324e1b8815c" + +[[package]] +name = "reqwest" +version = "0.12.9" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a77c62af46e79de0a562e1a9849205ffcb7fc1238876e9bd743357570e04046f" +dependencies = [ + "base64 0.22.1", + "bytes", + "encoding_rs", + "futures-core", + "futures-util", + "h2", + "http", + "http-body", + "http-body-util", + "hyper", + "hyper-rustls", + "hyper-tls", + "hyper-util", + "ipnet", + "js-sys", + "log", + "mime", + "native-tls", + "once_cell", + "percent-encoding", + "pin-project-lite", + "rustls-pemfile", + "serde", + "serde_json", + "serde_urlencoded", + "sync_wrapper", + "system-configuration", + "tokio", + "tokio-native-tls", + "tower-service", + "url", + "wasm-bindgen", + "wasm-bindgen-futures", + "web-sys", + "windows-registry", +] + +[[package]] +name = "ring" +version = "0.17.8" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c17fa4cb658e3583423e915b9f3acc01cceaee1860e33d59ebae66adc3a2dc0d" +dependencies = [ + "cc", + "cfg-if", + "getrandom", + "libc", + "spin", + "untrusted", + "windows-sys 0.52.0", +] + +[[package]] +name = "rle-decode-fast" +version = "1.0.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3582f63211428f83597b51b2ddb88e2a91a9d52d12831f9d08f5e624e8977422" + +[[package]] +name = "rmp" +version = "0.8.14" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "228ed7c16fa39782c3b3468e974aec2795e9089153cd08ee2e9aefb3613334c4" +dependencies = [ + "byteorder", + "num-traits", + "paste", +] + +[[package]] +name = "rmp-serde" +version = "1.3.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "52e599a477cf9840e92f2cde9a7189e67b42c57532749bf90aea6ec10facd4db" +dependencies = [ + "byteorder", + "rmp", + "serde", +] + +[[package]] +name = "rustc-demangle" +version = "0.1.24" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "719b953e2095829ee67db738b3bfa9fa368c94900df327b3f07fe6e794d2fe1f" + +[[package]] +name = "rustc-hash" +version = "2.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c7fb8039b3032c191086b10f11f319a6e99e1e82889c5cc6046f515c9db1d497" + +[[package]] +name = "rustix" +version = "0.38.42" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f93dc38ecbab2eb790ff964bb77fa94faf256fd3e73285fd7ba0903b76bedb85" +dependencies = [ + "bitflags", + "errno", + "itoa", + "libc", + "linux-raw-sys", + "once_cell", + "windows-sys 0.59.0", +] + +[[package]] +name = "rustls" +version = "0.23.19" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "934b404430bb06b3fae2cba809eb45a1ab1aecd64491213d7c3301b88393f8d1" +dependencies = [ + "log", + "once_cell", + "ring", + "rustls-pki-types", + "rustls-webpki", + "subtle", + "zeroize", +] + +[[package]] +name = "rustls-pemfile" +version = "2.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"dce314e5fee3f39953d46bb63bb8a46d40c2f8fb7cc5a3b6cab2bde9721d6e50" +dependencies = [ + "rustls-pki-types", +] + +[[package]] +name = "rustls-pki-types" +version = "1.10.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "16f1201b3c9a7ee8039bcadc17b7e605e2945b27eee7631788c1bd2b0643674b" + +[[package]] +name = "rustls-webpki" +version = "0.102.8" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "64ca1bc8749bd4cf37b5ce386cc146580777b4e8572c7b97baf22c83f444bee9" +dependencies = [ + "ring", + "rustls-pki-types", + "untrusted", +] + +[[package]] +name = "rustversion" +version = "1.0.18" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0e819f2bc632f285be6d7cd36e25940d45b2391dd6d9b939e79de557f7014248" + +[[package]] +name = "ryu" +version = "1.0.18" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f3cb5ba0dc43242ce17de99c180e96db90b235b8a9fdc9543c96d2209116bd9f" + +[[package]] +name = "same-file" +version = "1.0.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "93fc1dc3aaa9bfed95e02e6eadabb4baf7e3078b0bd1b4d7b6b0b68378900502" +dependencies = [ + "winapi-util", +] + +[[package]] +name = "schannel" +version = "0.1.27" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1f29ebaa345f945cec9fbbc532eb307f0fdad8161f281b6369539c8d84876b3d" +dependencies = [ + "windows-sys 0.59.0", +] + +[[package]] +name = "scoped-tls" +version = "1.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e1cf6437eb19a8f4a6cc0f7dca544973b0b78843adbfeb3683d1a94a0024a294" + +[[package]] +name = "scopeguard" +version = "1.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49" + +[[package]] +name = "security-framework" +version = "2.11.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "897b2245f0b511c87893af39b033e5ca9cce68824c4d7e7630b5a1d339658d02" +dependencies = [ + "bitflags", + "core-foundation", + "core-foundation-sys", + "libc", + "security-framework-sys", +] + +[[package]] +name = "security-framework-sys" +version = "2.12.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "fa39c7303dc58b5543c94d22c1766b0d31f2ee58306363ea622b10bbc075eaa2" +dependencies = [ + "core-foundation-sys", + "libc", +] + +[[package]] +name = "semver" +version = "1.0.23" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "61697e0a1c7e512e84a621326239844a24d8207b4669b41bc18b32ea5cbf988b" +dependencies = [ + "serde", +] + +[[package]] +name = "serde" +version = "1.0.215" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6513c1ad0b11a9376da888e3e0baa0077f1aed55c17f50e7b2397136129fb88f" +dependencies = [ + "serde_derive", +] + +[[package]] +name = "serde_bytes" +version = "0.11.15" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "387cc504cb06bb40a96c8e04e951fe01854cf6bc921053c954e4a606d9675c6a" +dependencies = [ + "serde", +] + +[[package]] +name = "serde_derive" +version = "1.0.215" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ad1e866f866923f252f05c889987993144fb74e722403468a4ebd70c3cd756c0" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "serde_json" +version = "1.0.133" +source = "registry+https://github.com/rust-lang/crates.io-index" 
+checksum = "c7fceb2473b9166b2294ef05efcb65a3db80803f0b03ef86a5fc88a2b85ee377" +dependencies = [ + "indexmap", + "itoa", + "memchr", + "ryu", + "serde", +] + +[[package]] +name = "serde_spanned" +version = "0.6.8" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "87607cb1398ed59d48732e575a4c28a7a8ebf2454b964fe3f224f2afc07909e1" +dependencies = [ + "serde", +] + +[[package]] +name = "serde_urlencoded" +version = "0.7.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d3491c14715ca2294c4d6a88f15e84739788c1d030eed8c110436aafdaa2f3fd" +dependencies = [ + "form_urlencoded", + "itoa", + "ryu", + "serde", +] + +[[package]] +name = "sha2" +version = "0.10.8" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "793db75ad2bcafc3ffa7c68b215fee268f537982cd901d132f89c6343f3a3dc8" +dependencies = [ + "cfg-if", + "cpufeatures", + "digest", +] + +[[package]] +name = "sharded-slab" +version = "0.1.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f40ca3c46823713e0d4209592e8d6e826aa57e928f09752619fc696c499637f6" +dependencies = [ + "lazy_static", +] + +[[package]] +name = "shellexpand" +version = "2.1.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7ccc8076840c4da029af4f87e4e8daeb0fca6b87bbb02e10cb60b791450e11e4" +dependencies = [ + "dirs", +] + +[[package]] +name = "shlex" +version = "1.3.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0fda2ff0d084019ba4d7c6f371c95d8fd75ce3524c3cb8fb653a3023f6323e64" + +[[package]] +name = "signal-hook" +version = "0.3.17" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8621587d4798caf8eb44879d42e56b9a93ea5dcd315a6487c357130095b62801" +dependencies = [ + "libc", + "signal-hook-registry", +] + +[[package]] +name = "signal-hook-mio" +version = "0.2.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "34db1a06d485c9142248b7a054f034b349b212551f3dfd19c94d45a754a217cd" +dependencies = [ + "libc", + "mio", + "signal-hook", +] + +[[package]] +name = "signal-hook-registry" +version = "1.4.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a9e9e0b4211b72e7b8b6e85c807d36c212bdb33ea8587f7569562a84df5465b1" +dependencies = [ + "libc", +] + +[[package]] +name = "similar" +version = "2.6.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1de1d4f81173b03af4c0cbed3c898f6bff5b870e4a7f5d6f4057d62a7a4b686e" + +[[package]] +name = "slab" +version = "0.4.9" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8f92a496fb766b417c996b9c5e57daf2f7ad3b0bebe1ccfca4856390e3d3bb67" +dependencies = [ + "autocfg", +] + +[[package]] +name = "slice-group-by" +version = "0.3.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "826167069c09b99d56f31e9ae5c99049e932a98c9dc2dac47645b08dbbf76ba7" + +[[package]] +name = "smallvec" +version = "1.13.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3c5e1a9a646d36c3599cd173a41282daf47c44583ad367b8e6837255952e5c67" +dependencies = [ + "serde", +] + +[[package]] +name = "smithay-client-toolkit" +version = "0.19.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3457dea1f0eb631b4034d61d4d8c32074caa6cd1ab2d59f2327bd8461e2c0016" +dependencies = [ + "bitflags", + "calloop", + "calloop-wayland-source", + "cursor-icon", + "libc", + "log", + "memmap2", + 
"rustix", + "thiserror", + "wayland-backend", + "wayland-client", + "wayland-csd-frame", + "wayland-cursor", + "wayland-protocols", + "wayland-protocols-wlr", + "wayland-scanner", + "xkeysym", +] + +[[package]] +name = "smithay-clipboard" +version = "0.7.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "cc8216eec463674a0e90f29e0ae41a4db573ec5b56b1c6c1c71615d249b6d846" +dependencies = [ + "libc", + "smithay-client-toolkit", + "wayland-backend", +] + +[[package]] +name = "socket2" +version = "0.5.8" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c970269d99b64e60ec3bd6ad27270092a5394c4e309314b18ae3fe575695fbe8" +dependencies = [ + "libc", + "windows-sys 0.52.0", +] + +[[package]] +name = "spin" +version = "0.9.8" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6980e8d7511241f8acf4aebddbb1ff938df5eebe98691418c4468d0b72a96a67" + +[[package]] +name = "sptr" +version = "0.3.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3b9b39299b249ad65f3b7e96443bad61c02ca5cd3589f46cb6d610a0fd6c0d6a" + +[[package]] +name = "stable_deref_trait" +version = "1.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a8f112729512f8e442d81f95a8a7ddf2b7c6b8a1a6f509a95864142b30cab2d3" + +[[package]] +name = "static_assertions" +version = "1.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a2eb9349b6444b326872e140eb1cf5e7c522154d69e7a0ffb0fb81c06b37543f" + +[[package]] +name = "strsim" +version = "0.11.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7da8b5736845d9f2fcb837ea5d9e2628564b3b043a70948a3f0b778838c5fb4f" + +[[package]] +name = "strum" +version = "0.26.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8fec0f0aef304996cf250b31b5a10dee7980c85da9d759361292b8bca5a18f06" +dependencies = [ + "strum_macros", +] + +[[package]] +name = "strum_macros" +version = "0.26.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4c6bee85a5a24955dc440386795aa378cd9cf82acd5f764469152d2270e581be" +dependencies = [ + "heck 0.5.0", + "proc-macro2", + "quote", + "rustversion", + "syn", +] + +[[package]] +name = "subtle" +version = "2.6.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "13c2bddecc57b384dee18652358fb23172facb8a2c51ccc10d74c157bdea3292" + +[[package]] +name = "syn" +version = "2.0.90" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "919d3b74a5dd0ccd15aeb8f93e7006bd9e14c295087c9896a110f490752bcf31" +dependencies = [ + "proc-macro2", + "quote", + "unicode-ident", +] + +[[package]] +name = "sync_wrapper" +version = "1.0.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0bf256ce5efdfa370213c1dabab5935a12e49f2c58d15e9eac2870d3b4f27263" +dependencies = [ + "futures-core", +] + +[[package]] +name = "synstructure" +version = "0.13.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c8af7666ab7b6390ab78131fb5b0fce11d6b7a6951602017c35fa82800708971" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "system-configuration" +version = "0.6.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3c879d448e9d986b661742763247d3693ed13609438cf3d006f51f5368a5ba6b" +dependencies = [ + "bitflags", + "core-foundation", + "system-configuration-sys", +] + +[[package]] +name = 
"system-configuration-sys" +version = "0.6.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8e1d1b10ced5ca923a1fcb8d03e96b8d3268065d724548c0211415ff6ac6bac4" +dependencies = [ + "core-foundation-sys", + "libc", +] + +[[package]] +name = "system-interface" +version = "0.27.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "cc4592f674ce18521c2a81483873a49596655b179f71c5e05d10c1fe66c78745" +dependencies = [ + "bitflags", + "cap-fs-ext", + "cap-std", + "fd-lock", + "io-lifetimes", + "rustix", + "windows-sys 0.59.0", + "winx", +] + +[[package]] +name = "target-lexicon" +version = "0.12.16" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "61c41af27dd6d1e27b1b16b489db798443478cef1f06a660c96db617ba5de3b1" + +[[package]] +name = "tempfile" +version = "3.14.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "28cce251fcbc87fac86a866eeb0d6c2d536fc16d06f184bb61aeae11aa4cee0c" +dependencies = [ + "cfg-if", + "fastrand", + "once_cell", + "rustix", + "windows-sys 0.59.0", +] + +[[package]] +name = "termcolor" +version = "1.4.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "06794f8f6c5c898b3275aebefa6b8a1cb24cd2c6c79397ab15774837a0bc5755" +dependencies = [ + "winapi-util", +] + +[[package]] +name = "thiserror" +version = "1.0.69" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b6aaf5339b578ea85b50e080feb250a3e8ae8cfcdff9a461c9ec2904bc923f52" +dependencies = [ + "thiserror-impl", +] + +[[package]] +name = "thiserror-impl" +version = "1.0.69" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4fee6c4efc90059e10f81e6d42c60a18f76588c3d74cb83a0b242a2b6c7504c1" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "thousands" +version = "0.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3bf63baf9f5039dadc247375c29eb13706706cfde997d0330d05aa63a77d8820" + +[[package]] +name = "thread_local" +version = "1.1.8" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8b9ef9bad013ada3808854ceac7b46812a6465ba368859a37e2100283d2d719c" +dependencies = [ + "cfg-if", + "once_cell", +] + +[[package]] +name = "throbber-widgets-tui" +version = "0.8.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1d36b5738d666a2b4c91b7c24998a8588db724b3107258343ebf8824bf55b06d" +dependencies = [ + "rand", + "ratatui", +] + +[[package]] +name = "tinystr" +version = "0.7.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9117f5d4db391c1cf6927e7bea3db74b9a1c1add8f7eda9ffd5364f40f57b82f" +dependencies = [ + "displaydoc", + "zerovec", +] + +[[package]] +name = "tokio" +version = "1.42.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5cec9b21b0450273377fc97bd4c33a8acffc8c996c987a7c5b319a0083707551" +dependencies = [ + "backtrace", + "bytes", + "libc", + "mio", + "parking_lot", + "pin-project-lite", + "signal-hook-registry", + "socket2", + "tokio-macros", + "tracing", + "windows-sys 0.52.0", +] + +[[package]] +name = "tokio-macros" +version = "2.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "693d596312e88961bc67d7f1f97af8a70227d9f90c31bba5806eec004978d752" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "tokio-native-tls" +version = "0.3.1" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "bbae76ab933c85776efabc971569dd6119c580d8f5d448769dec1764bf796ef2" +dependencies = [ + "native-tls", + "tokio", +] + +[[package]] +name = "tokio-rustls" +version = "0.26.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5f6d0975eaace0cf0fcadee4e4aaa5da15b5c079146f2cffb67c113be122bf37" +dependencies = [ + "rustls", + "tokio", +] + +[[package]] +name = "tokio-util" +version = "0.7.13" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d7fcaa8d55a2bdd6b83ace262b016eca0d79ee02818c5c1bcdf0305114081078" +dependencies = [ + "bytes", + "futures-core", + "futures-sink", + "pin-project-lite", + "tokio", +] + +[[package]] +name = "toml" +version = "0.8.19" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a1ed1f98e3fdc28d6d910e6737ae6ab1a93bf1985935a1193e68f93eeb68d24e" +dependencies = [ + "serde", + "serde_spanned", + "toml_datetime", + "toml_edit", +] + +[[package]] +name = "toml_datetime" +version = "0.6.8" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0dd7358ecb8fc2f8d014bf86f6f638ce72ba252a2c3a2572f2a795f1d23efb41" +dependencies = [ + "serde", +] + +[[package]] +name = "toml_edit" +version = "0.22.22" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4ae48d6208a266e853d946088ed816055e556cc6028c5e8e2b84d9fa5dd7c7f5" +dependencies = [ + "indexmap", + "serde", + "serde_spanned", + "toml_datetime", + "winnow", +] + +[[package]] +name = "tower-service" +version = "0.3.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8df9b6e13f2d32c91b9bd719c00d1958837bc7dec474d94952798cc8e69eeec3" + +[[package]] +name = "tracing" +version = "0.1.41" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "784e0ac535deb450455cbfa28a6f0df145ea1bb7ae51b821cf5e7927fdcfbdd0" +dependencies = [ + "log", + "pin-project-lite", + "tracing-attributes", + "tracing-core", +] + +[[package]] +name = "tracing-attributes" +version = "0.1.28" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "395ae124c09f9e6918a2310af6038fba074bcf474ac352496d5910dd59a2226d" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "tracing-core" +version = "0.1.33" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e672c95779cf947c5311f83787af4fa8fffd12fb27e4993211a84bdfd9610f9c" +dependencies = [ + "once_cell", + "valuable", +] + +[[package]] +name = "tracing-log" +version = "0.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ee855f1f400bd0e5c02d150ae5de3840039a3f54b025156404e34c23c03f47c3" +dependencies = [ + "log", + "once_cell", + "tracing-core", +] + +[[package]] +name = "tracing-subscriber" +version = "0.3.19" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e8189decb5ac0fa7bc8b96b7cb9b2701d60d48805aca84a238004d665fcc4008" +dependencies = [ + "matchers", + "nu-ansi-term", + "once_cell", + "regex", + "sharded-slab", + "smallvec", + "thread_local", + "tracing", + "tracing-core", + "tracing-log", +] + +[[package]] +name = "try-lock" +version = "0.2.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e421abadd41a4225275504ea4d6566923418b7f05506fbc9c0fe86ba7396114b" + +[[package]] +name = "tui-input" +version = "0.11.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"e5d1733c47f1a217b7deff18730ff7ca4ecafc5771368f715ab072d679a36114" +dependencies = [ + "ratatui", + "unicode-width 0.2.0", +] + +[[package]] +name = "typed-builder" +version = "0.19.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a06fbd5b8de54c5f7c91f6fe4cebb949be2125d7758e630bb58b1d831dbce600" +dependencies = [ + "typed-builder-macro", +] + +[[package]] +name = "typed-builder-macro" +version = "0.19.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f9534daa9fd3ed0bd911d462a37f172228077e7abf18c18a5f67199d959205f8" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "typenum" +version = "1.17.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "42ff0bf0c66b8238c6f3b578df37d0b7848e55df8577b3f74f92a69acceeb825" + +[[package]] +name = "unicode-ident" +version = "1.0.14" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "adb9e6ca4f869e1180728b7950e35922a7fc6397f7b641499e8f3ef06e50dc83" + +[[package]] +name = "unicode-segmentation" +version = "1.12.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f6ccf251212114b54433ec949fd6a7841275f9ada20dddd2f29e9ceea4501493" + +[[package]] +name = "unicode-truncate" +version = "1.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b3644627a5af5fa321c95b9b235a72fd24cd29c648c2c379431e6628655627bf" +dependencies = [ + "itertools 0.13.0", + "unicode-segmentation", + "unicode-width 0.1.14", +] + +[[package]] +name = "unicode-width" +version = "0.1.14" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7dd6e30e90baa6f72411720665d41d89b9a3d039dc45b8faea1ddd07f617f6af" + +[[package]] +name = "unicode-width" +version = "0.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1fc81956842c57dac11422a97c3b8195a1ff727f06e85c84ed2e8aa277c9a0fd" + +[[package]] +name = "unicode-xid" +version = "0.2.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ebc1c04c71510c7f702b52b7c350734c9ff1295c464a03335b00bb84fc54f853" + +[[package]] +name = "untrusted" +version = "0.9.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8ecb6da28b8a351d773b68d5825ac39017e680750f980f3a1a85cd8dd28a47c1" + +[[package]] +name = "ureq" +version = "2.12.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "02d1a66277ed75f640d608235660df48c8e3c19f3b4edb6a263315626cc3c01d" +dependencies = [ + "base64 0.22.1", + "flate2", + "log", + "once_cell", + "rustls", + "rustls-pki-types", + "url", + "webpki-roots", +] + +[[package]] +name = "url" +version = "2.5.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "32f8b686cadd1473f4bd0117a5d28d36b1ade384ea9b5069a1c40aefed7fda60" +dependencies = [ + "form_urlencoded", + "idna", + "percent-encoding", + "serde", +] + +[[package]] +name = "utf16_iter" +version = "1.0.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c8232dd3cdaed5356e0f716d285e4b40b932ac434100fe9b7e0e8e935b9e6246" + +[[package]] +name = "utf8_iter" +version = "1.0.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b6c140620e7ffbb22c2dee59cafe6084a59b5ffc27a8859a5f0d494b5d52b6be" + +[[package]] +name = "utf8parse" +version = "0.2.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"06abde3611657adf66d383f00b093d7faecc7fa57071cce2578660c9f1010821" + +[[package]] +name = "uuid" +version = "1.11.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f8c5f0a0af699448548ad1a2fbf920fb4bee257eae39953ba95cb84891a0446a" +dependencies = [ + "getrandom", + "serde", +] + +[[package]] +name = "valuable" +version = "0.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "830b7e5d4d90034032940e4ace0d9a9a057e7a45cd94e6c007832e39edb82f6d" + +[[package]] +name = "vcpkg" +version = "0.2.15" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "accd4ea62f7bb7a82fe23066fb0957d48ef677f6eeb8215f372f52e48bb32426" + +[[package]] +name = "version_check" +version = "0.9.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0b928f33d975fc6ad9f86c8f283853ad26bdd5b10b7f1542aa2fa15e2289105a" + +[[package]] +name = "walkdir" +version = "2.5.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "29790946404f91d9c5d06f9874efddea1dc06c5efe94541a7d6863108e3a5e4b" +dependencies = [ + "same-file", + "winapi-util", +] + +[[package]] +name = "want" +version = "0.3.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "bfa7760aed19e106de2c7c0b581b509f2f25d3dacaf737cb82ac61bc6d760b0e" +dependencies = [ + "try-lock", +] + +[[package]] +name = "wasi" +version = "0.11.0+wasi-snapshot-preview1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9c8d87e72b64a3b4db28d11ce29237c246188f4f51057d65a7eab63b7987e423" + +[[package]] +name = "wasi-common" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "165a969c7b4ac223150e2819df36d58b8f24b06320dc314503f90300e5e18bc1" +dependencies = [ + "anyhow", + "bitflags", + "cap-fs-ext", + "cap-rand", + "cap-std", + "cap-time-ext", + "fs-set-times", + "io-extras", + "io-lifetimes", + "log", + "once_cell", + "rustix", + "system-interface", + "thiserror", + "tracing", + "wasmtime", + "wiggle", + "windows-sys 0.59.0", +] + +[[package]] +name = "wasm-bindgen" +version = "0.2.99" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a474f6281d1d70c17ae7aa6a613c87fce69a127e2624002df63dcb39d6cf6396" +dependencies = [ + "cfg-if", + "once_cell", + "wasm-bindgen-macro", +] + +[[package]] +name = "wasm-bindgen-backend" +version = "0.2.99" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5f89bb38646b4f81674e8f5c3fb81b562be1fd936d84320f3264486418519c79" +dependencies = [ + "bumpalo", + "log", + "proc-macro2", + "quote", + "syn", + "wasm-bindgen-shared", +] + +[[package]] +name = "wasm-bindgen-futures" +version = "0.4.49" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "38176d9b44ea84e9184eff0bc34cc167ed044f816accfe5922e54d84cf48eca2" +dependencies = [ + "cfg-if", + "js-sys", + "once_cell", + "wasm-bindgen", + "web-sys", +] + +[[package]] +name = "wasm-bindgen-macro" +version = "0.2.99" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2cc6181fd9a7492eef6fef1f33961e3695e4579b9872a6f7c83aee556666d4fe" +dependencies = [ + "quote", + "wasm-bindgen-macro-support", +] + +[[package]] +name = "wasm-bindgen-macro-support" +version = "0.2.99" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "30d7a95b763d3c45903ed6c81f156801839e5ee968bb07e534c44df0fcd330c2" +dependencies = [ + "proc-macro2", + "quote", + "syn", + 
"wasm-bindgen-backend", + "wasm-bindgen-shared", +] + +[[package]] +name = "wasm-bindgen-shared" +version = "0.2.99" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "943aab3fdaaa029a6e0271b35ea10b72b943135afe9bffca82384098ad0e06a6" + +[[package]] +name = "wasm-encoder" +version = "0.218.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "22b896fa8ceb71091ace9bcb81e853f54043183a1c9667cf93422c40252ffa0a" +dependencies = [ + "leb128", +] + +[[package]] +name = "wasm-encoder" +version = "0.221.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c17a3bd88f2155da63a1f2fcb8a56377a24f0b6dfed12733bb5f544e86f690c5" +dependencies = [ + "leb128", + "wasmparser 0.221.2", +] + +[[package]] +name = "wasmparser" +version = "0.218.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b09e46c7fceceaa72b2dd1a8a137ea7fd8f93dfaa69806010a709918e496c5dc" +dependencies = [ + "ahash", + "bitflags", + "hashbrown 0.14.5", + "indexmap", + "semver", + "serde", +] + +[[package]] +name = "wasmparser" +version = "0.221.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9845c470a2e10b61dd42c385839cdd6496363ed63b5c9e420b5488b77bd22083" +dependencies = [ + "bitflags", + "indexmap", + "semver", +] + +[[package]] +name = "wasmprinter" +version = "0.218.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0ace089155491837b75f474bf47c99073246d1b737393fe722d6dee311595ddc" +dependencies = [ + "anyhow", + "termcolor", + "wasmparser 0.218.0", +] + +[[package]] +name = "wasmtime" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "51e762e163fd305770c6c341df3290f0cabb3c264e7952943018e9a1ced8d917" +dependencies = [ + "addr2line", + "anyhow", + "async-trait", + "bitflags", + "bumpalo", + "cc", + "cfg-if", + "encoding_rs", + "fxprof-processed-profile", + "gimli", + "hashbrown 0.14.5", + "indexmap", + "ittapi", + "libc", + "libm", + "log", + "mach2", + "memfd", + "object", + "once_cell", + "paste", + "postcard", + "psm", + "pulley-interpreter", + "rayon", + "rustix", + "semver", + "serde", + "serde_derive", + "serde_json", + "smallvec", + "sptr", + "target-lexicon", + "wasm-encoder 0.218.0", + "wasmparser 0.218.0", + "wasmtime-asm-macros", + "wasmtime-cache", + "wasmtime-component-macro", + "wasmtime-component-util", + "wasmtime-cranelift", + "wasmtime-environ", + "wasmtime-fiber", + "wasmtime-jit-debug", + "wasmtime-jit-icache-coherence", + "wasmtime-slab", + "wasmtime-versioned-export-macros", + "wasmtime-winch", + "wat", + "windows-sys 0.59.0", +] + +[[package]] +name = "wasmtime-asm-macros" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "63caa7aebb546374e26257a1900fb93579171e7c02514cde26805b9ece3ef812" +dependencies = [ + "cfg-if", +] + +[[package]] +name = "wasmtime-cache" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c7192f71e3afe32e858729454d9d90d6e927bd92427d688a9507d8220bddb256" +dependencies = [ + "anyhow", + "base64 0.21.7", + "directories-next", + "log", + "postcard", + "rustix", + "serde", + "serde_derive", + "sha2", + "toml", + "windows-sys 0.59.0", + "zstd", +] + +[[package]] +name = "wasmtime-component-macro" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d61a4b5ce2ad9c15655e830f0eac0c38b8def30c74ecac71f452d3901e491b68" +dependencies = [ + 
"anyhow", + "proc-macro2", + "quote", + "syn", + "wasmtime-component-util", + "wasmtime-wit-bindgen", + "wit-parser", +] + +[[package]] +name = "wasmtime-component-util" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "35e87a1212270dbb84a49af13d82594e00a92769d6952b0ea7fc4366c949f6ad" + +[[package]] +name = "wasmtime-cranelift" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7cb40dddf38c6a5eefd5ce7c1baf43b00fe44eada11a319fab22e993a960262f" +dependencies = [ + "anyhow", + "cfg-if", + "cranelift-codegen", + "cranelift-control", + "cranelift-entity", + "cranelift-frontend", + "cranelift-native", + "gimli", + "itertools 0.12.1", + "log", + "object", + "smallvec", + "target-lexicon", + "thiserror", + "wasmparser 0.218.0", + "wasmtime-environ", + "wasmtime-versioned-export-macros", +] + +[[package]] +name = "wasmtime-environ" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8613075e89e94a48c05862243c2b718eef1b9c337f51493ebf951e149a10fa19" +dependencies = [ + "anyhow", + "cpp_demangle", + "cranelift-bitset", + "cranelift-entity", + "gimli", + "indexmap", + "log", + "object", + "postcard", + "rustc-demangle", + "semver", + "serde", + "serde_derive", + "smallvec", + "target-lexicon", + "wasm-encoder 0.218.0", + "wasmparser 0.218.0", + "wasmprinter", + "wasmtime-component-util", +] + +[[package]] +name = "wasmtime-fiber" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "77acabfbcd89a4d47ad117fb31e340c824e2f49597105402c3127457b6230995" +dependencies = [ + "anyhow", + "cc", + "cfg-if", + "rustix", + "wasmtime-asm-macros", + "wasmtime-versioned-export-macros", + "windows-sys 0.59.0", +] + +[[package]] +name = "wasmtime-jit-debug" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f02a0118d471de665565ed200bc56673eaa10cc8e223dfe2cef5d50ed0d9d143" +dependencies = [ + "object", + "once_cell", + "rustix", + "wasmtime-versioned-export-macros", +] + +[[package]] +name = "wasmtime-jit-icache-coherence" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "da47fba49af72581bc0dc67c8faaf5ee550e6f106e285122a184a675193701a5" +dependencies = [ + "anyhow", + "cfg-if", + "libc", + "windows-sys 0.59.0", +] + +[[package]] +name = "wasmtime-slab" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "770e10cdefb15f2b6304152978e115bd062753c1ebe7221c0b6b104fa0419ff6" + +[[package]] +name = "wasmtime-versioned-export-macros" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "db8efb877c9e5e67239d4553bb44dd2a34ae5cfb728f3cf2c5e64439c6ca6ee7" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "wasmtime-winch" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4f7a267367382ceec3e7f7ace63a63b83d86f4a680846743dead644e10f08150" +dependencies = [ + "anyhow", + "cranelift-codegen", + "gimli", + "object", + "target-lexicon", + "wasmparser 0.218.0", + "wasmtime-cranelift", + "wasmtime-environ", + "winch-codegen", +] + +[[package]] +name = "wasmtime-wit-bindgen" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4bef2a726fd8d1ee9b0144655e16c492dc32eb4c7c9f7e3309fcffe637870933" +dependencies = [ + "anyhow", + "heck 0.5.0", + "indexmap", 
+ "wit-parser", +] + +[[package]] +name = "wast" +version = "35.0.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2ef140f1b49946586078353a453a1d28ba90adfc54dde75710bc1931de204d68" +dependencies = [ + "leb128", +] + +[[package]] +name = "wast" +version = "221.0.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "fcc4470b9de917ba199157d1f0ae104f2ae362be728c43e68c571c7715bd629e" +dependencies = [ + "bumpalo", + "leb128", + "memchr", + "unicode-width 0.2.0", + "wasm-encoder 0.221.2", +] + +[[package]] +name = "wat" +version = "1.221.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6b1f3c6d82af47286494c6caea1d332037f5cbeeac82bbf5ef59cb8c201c466e" +dependencies = [ + "wast 221.0.2", +] + +[[package]] +name = "wayland-backend" +version = "0.3.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "056535ced7a150d45159d3a8dc30f91a2e2d588ca0b23f70e56033622b8016f6" +dependencies = [ + "cc", + "downcast-rs", + "rustix", + "scoped-tls", + "smallvec", + "wayland-sys", +] + +[[package]] +name = "wayland-client" +version = "0.31.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b66249d3fc69f76fd74c82cc319300faa554e9d865dab1f7cd66cc20db10b280" +dependencies = [ + "bitflags", + "rustix", + "wayland-backend", + "wayland-scanner", +] + +[[package]] +name = "wayland-csd-frame" +version = "0.3.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "625c5029dbd43d25e6aa9615e88b829a5cad13b2819c4ae129fdbb7c31ab4c7e" +dependencies = [ + "bitflags", + "cursor-icon", + "wayland-backend", +] + +[[package]] +name = "wayland-cursor" +version = "0.31.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "32b08bc3aafdb0035e7fe0fdf17ba0c09c268732707dca4ae098f60cb28c9e4c" +dependencies = [ + "rustix", + "wayland-client", + "xcursor", +] + +[[package]] +name = "wayland-protocols" +version = "0.32.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7cd0ade57c4e6e9a8952741325c30bf82f4246885dca8bf561898b86d0c1f58e" +dependencies = [ + "bitflags", + "wayland-backend", + "wayland-client", + "wayland-scanner", +] + +[[package]] +name = "wayland-protocols-wlr" +version = "0.3.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "782e12f6cd923c3c316130d56205ebab53f55d6666b7faddfad36cecaeeb4022" +dependencies = [ + "bitflags", + "wayland-backend", + "wayland-client", + "wayland-protocols", + "wayland-scanner", +] + +[[package]] +name = "wayland-scanner" +version = "0.31.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "597f2001b2e5fc1121e3d5b9791d3e78f05ba6bfa4641053846248e3a13661c3" +dependencies = [ + "proc-macro2", + "quick-xml", + "quote", +] + +[[package]] +name = "wayland-sys" +version = "0.31.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "efa8ac0d8e8ed3e3b5c9fc92c7881406a268e11555abe36493efabe649a29e09" +dependencies = [ + "dlib", + "log", + "once_cell", + "pkg-config", +] + +[[package]] +name = "web-sys" +version = "0.3.76" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "04dd7223427d52553d3702c004d3b2fe07c148165faa56313cb00211e31c12bc" +dependencies = [ + "js-sys", + "wasm-bindgen", +] + +[[package]] +name = "web-time" +version = "1.1.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"5a6580f308b1fad9207618087a65c04e7a10bc77e02c8e84e9b00dd4b12fa0bb" +dependencies = [ + "js-sys", + "wasm-bindgen", +] + +[[package]] +name = "webpki-roots" +version = "0.26.7" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5d642ff16b7e79272ae451b7322067cdc17cadf68c23264be9d94a32319efe7e" +dependencies = [ + "rustls-pki-types", +] + +[[package]] +name = "wiggle" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b0f25588cf5ea16f56c1af13244486d50c5a2cf67cc0c4e990c665944d741546" +dependencies = [ + "anyhow", + "async-trait", + "bitflags", + "thiserror", + "tracing", + "wasmtime", + "wiggle-macro", + "witx", +] + +[[package]] +name = "wiggle-generate" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "28ff23bed568b335dac6a324b8b167318a0c60555199445fcc89745a5eb42452" +dependencies = [ + "anyhow", + "heck 0.5.0", + "proc-macro2", + "quote", + "shellexpand", + "syn", + "witx", +] + +[[package]] +name = "wiggle-macro" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7f13be83541aa0b033ac5ec8a8b59c9a8d8b32305845b8466dd066e722cb0004" +dependencies = [ + "proc-macro2", + "quote", + "syn", + "wiggle-generate", +] + +[[package]] +name = "winapi" +version = "0.3.9" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5c839a674fcd7a98952e593242ea400abe93992746761e38641405d28b00f419" +dependencies = [ + "winapi-i686-pc-windows-gnu", + "winapi-x86_64-pc-windows-gnu", +] + +[[package]] +name = "winapi-i686-pc-windows-gnu" +version = "0.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ac3b87c63620426dd9b991e5ce0329eff545bccbbb34f3be09ff6fb6ab51b7b6" + +[[package]] +name = "winapi-util" +version = "0.1.9" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "cf221c93e13a30d793f7645a0e7762c55d169dbb0a49671918a2319d289b10bb" +dependencies = [ + "windows-sys 0.59.0", +] + +[[package]] +name = "winapi-x86_64-pc-windows-gnu" +version = "0.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "712e227841d057c1ee1cd2fb22fa7e5a5461ae8e48fa2ca79ec42cfc1931183f" + +[[package]] +name = "winch-codegen" +version = "26.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "07ab957fc71a36c63834b9b51cc2e087c4260d5ff810a5309ab99f7fbeb19567" +dependencies = [ + "anyhow", + "cranelift-codegen", + "gimli", + "regalloc2", + "smallvec", + "target-lexicon", + "wasmparser 0.218.0", + "wasmtime-cranelift", + "wasmtime-environ", +] + +[[package]] +name = "windows-core" +version = "0.52.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "33ab640c8d7e35bf8ba19b884ba838ceb4fba93a4e8c65a9059d08afcfc683d9" +dependencies = [ + "windows-targets 0.52.6", +] + +[[package]] +name = "windows-registry" +version = "0.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e400001bb720a623c1c69032f8e3e4cf09984deec740f007dd2b03ec864804b0" +dependencies = [ + "windows-result", + "windows-strings", + "windows-targets 0.52.6", +] + +[[package]] +name = "windows-result" +version = "0.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1d1043d8214f791817bab27572aaa8af63732e11bf84aa21a45a78d6c317ae0e" +dependencies = [ + "windows-targets 0.52.6", +] + +[[package]] +name = "windows-strings" +version = "0.1.0" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "4cd9b125c486025df0eabcb585e62173c6c9eddcec5d117d3b6e8c30e2ee4d10" +dependencies = [ + "windows-result", + "windows-targets 0.52.6", +] + +[[package]] +name = "windows-sys" +version = "0.48.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "677d2418bec65e3338edb076e806bc1ec15693c5d0104683f2efe857f61056a9" +dependencies = [ + "windows-targets 0.48.5", +] + +[[package]] +name = "windows-sys" +version = "0.52.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "282be5f36a8ce781fad8c8ae18fa3f9beff57ec1b52cb3de0789201425d9a33d" +dependencies = [ + "windows-targets 0.52.6", +] + +[[package]] +name = "windows-sys" +version = "0.59.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1e38bc4d79ed67fd075bcc251a1c39b32a1776bbe92e5bef1f0bf1f8c531853b" +dependencies = [ + "windows-targets 0.52.6", +] + +[[package]] +name = "windows-targets" +version = "0.48.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9a2fa6e2155d7247be68c096456083145c183cbbbc2764150dda45a87197940c" +dependencies = [ + "windows_aarch64_gnullvm 0.48.5", + "windows_aarch64_msvc 0.48.5", + "windows_i686_gnu 0.48.5", + "windows_i686_msvc 0.48.5", + "windows_x86_64_gnu 0.48.5", + "windows_x86_64_gnullvm 0.48.5", + "windows_x86_64_msvc 0.48.5", +] + +[[package]] +name = "windows-targets" +version = "0.52.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9b724f72796e036ab90c1021d4780d4d3d648aca59e491e6b98e725b84e99973" +dependencies = [ + "windows_aarch64_gnullvm 0.52.6", + "windows_aarch64_msvc 0.52.6", + "windows_i686_gnu 0.52.6", + "windows_i686_gnullvm", + "windows_i686_msvc 0.52.6", + "windows_x86_64_gnu 0.52.6", + "windows_x86_64_gnullvm 0.52.6", + "windows_x86_64_msvc 0.52.6", +] + +[[package]] +name = "windows_aarch64_gnullvm" +version = "0.48.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2b38e32f0abccf9987a4e3079dfb67dcd799fb61361e53e2882c3cbaf0d905d8" + +[[package]] +name = "windows_aarch64_gnullvm" +version = "0.52.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "32a4622180e7a0ec044bb555404c800bc9fd9ec262ec147edd5989ccd0c02cd3" + +[[package]] +name = "windows_aarch64_msvc" +version = "0.48.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "dc35310971f3b2dbbf3f0690a219f40e2d9afcf64f9ab7cc1be722937c26b4bc" + +[[package]] +name = "windows_aarch64_msvc" +version = "0.52.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "09ec2a7bb152e2252b53fa7803150007879548bc709c039df7627cabbd05d469" + +[[package]] +name = "windows_i686_gnu" +version = "0.48.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a75915e7def60c94dcef72200b9a8e58e5091744960da64ec734a6c6e9b3743e" + +[[package]] +name = "windows_i686_gnu" +version = "0.52.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8e9b5ad5ab802e97eb8e295ac6720e509ee4c243f69d781394014ebfe8bbfa0b" + +[[package]] +name = "windows_i686_gnullvm" +version = "0.52.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0eee52d38c090b3caa76c563b86c3a4bd71ef1a819287c19d586d7334ae8ed66" + +[[package]] +name = "windows_i686_msvc" +version = "0.48.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"8f55c233f70c4b27f66c523580f78f1004e8b5a8b659e05a4eb49d4166cca406" + +[[package]] +name = "windows_i686_msvc" +version = "0.52.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "240948bc05c5e7c6dabba28bf89d89ffce3e303022809e73deaefe4f6ec56c66" + +[[package]] +name = "windows_x86_64_gnu" +version = "0.48.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "53d40abd2583d23e4718fddf1ebec84dbff8381c07cae67ff7768bbf19c6718e" + +[[package]] +name = "windows_x86_64_gnu" +version = "0.52.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "147a5c80aabfbf0c7d901cb5895d1de30ef2907eb21fbbab29ca94c5b08b1a78" + +[[package]] +name = "windows_x86_64_gnullvm" +version = "0.48.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0b7b52767868a23d5bab768e390dc5f5c55825b6d30b86c844ff2dc7414044cc" + +[[package]] +name = "windows_x86_64_gnullvm" +version = "0.52.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "24d5b23dc417412679681396f2b49f3de8c1473deb516bd34410872eff51ed0d" + +[[package]] +name = "windows_x86_64_msvc" +version = "0.48.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ed94fce61571a4006852b7389a063ab983c02eb1bb37b47f8272ce92d06d9538" + +[[package]] +name = "windows_x86_64_msvc" +version = "0.52.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "589f6da84c646204747d1270a2a5661ea66ed1cced2631d546fdfb155959f9ec" + +[[package]] +name = "winnow" +version = "0.6.20" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "36c1fec1a2bb5866f07c25f68c26e565c4c200aebb96d7e55710c19d3e8ac49b" +dependencies = [ + "memchr", +] + +[[package]] +name = "winx" +version = "0.36.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3f3fd376f71958b862e7afb20cfe5a22830e1963462f3a17f49d82a6c1d1f42d" +dependencies = [ + "bitflags", + "windows-sys 0.59.0", +] + +[[package]] +name = "wit-parser" +version = "0.218.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0d3d1066ab761b115f97fef2b191090faabcb0f37b555b758d3caf42d4ed9e55" +dependencies = [ + "anyhow", + "id-arena", + "indexmap", + "log", + "semver", + "serde", + "serde_derive", + "serde_json", + "unicode-xid", + "wasmparser 0.218.0", +] + +[[package]] +name = "witx" +version = "0.9.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e366f27a5cabcddb2706a78296a40b8fcc451e1a6aba2fc1d94b4a01bdaaef4b" +dependencies = [ + "anyhow", + "log", + "thiserror", + "wast 35.0.2", +] + +[[package]] +name = "write16" +version = "1.0.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d1890f4022759daae28ed4fe62859b1236caebfc61ede2f63ed4e695f3f6d936" + +[[package]] +name = "writeable" +version = "0.5.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1e9df38ee2d2c3c5948ea468a8406ff0db0b29ae1ffde1bcf20ef305bcc95c51" + +[[package]] +name = "x11-clipboard" +version = "0.9.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "662d74b3d77e396b8e5beb00b9cad6a9eccf40b2ef68cc858784b14c41d535a3" +dependencies = [ + "libc", + "x11rb", +] + +[[package]] +name = "x11rb" +version = "0.13.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5d91ffca73ee7f68ce055750bf9f6eca0780b8c85eff9bc046a3b0da41755e12" +dependencies = [ + "gethostname", + "rustix", + 
"x11rb-protocol", +] + +[[package]] +name = "x11rb-protocol" +version = "0.13.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ec107c4503ea0b4a98ef47356329af139c0a4f7750e621cf2973cd3385ebcb3d" + +[[package]] +name = "xcursor" +version = "0.3.8" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0ef33da6b1660b4ddbfb3aef0ade110c8b8a781a3b6382fa5f2b5b040fd55f61" + +[[package]] +name = "xkeysym" +version = "0.2.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b9cc00251562a284751c9973bace760d86c0276c471b4be569fe6b068ee97a56" + +[[package]] +name = "yansi" +version = "1.0.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "cfe53a6657fd280eaa890a3bc59152892ffa3e30101319d168b781ed6529b049" + +[[package]] +name = "yoke" +version = "0.7.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "120e6aef9aa629e3d4f52dc8cc43a015c7724194c97dfaf45180d2daf2b77f40" +dependencies = [ + "serde", + "stable_deref_trait", + "yoke-derive", + "zerofrom", +] + +[[package]] +name = "yoke-derive" +version = "0.7.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2380878cad4ac9aac1e2435f3eb4020e8374b5f13c296cb75b4620ff8e229154" +dependencies = [ + "proc-macro2", + "quote", + "syn", + "synstructure", +] + +[[package]] +name = "yozefu" +version = "0.0.1" +dependencies = [ + "tokio", + "yozefu-command", +] + +[[package]] +name = "yozefu-app" +version = "0.0.1" +dependencies = [ + "async-trait", + "extism", + "indexmap", + "itertools 0.13.0", + "log", + "rdkafka", + "serde", + "serde_json", + "thousands", + "url", + "yozefu-lib", +] + +[[package]] +name = "yozefu-command" +version = "0.0.1" +dependencies = [ + "chrono", + "clap", + "directories", + "env_logger", + "extism", + "futures", + "indexmap", + "indicatif", + "itertools 0.13.0", + "log", + "rdkafka", + "reqwest", + "serde_json", + "strum", + "tempfile", + "tokio", + "tokio-util", + "yozefu-app", + "yozefu-lib", + "yozefu-tui", +] + +[[package]] +name = "yozefu-lib" +version = "0.0.1" +dependencies = [ + "apache-avro", + "chrono", + "fuzzydate", + "insta", + "itertools 0.13.0", + "nom", + "protobuf", + "rdkafka", + "reqwest", + "serde", + "serde_json", + "strum", + "url", +] + +[[package]] +name = "yozefu-tui" +version = "0.0.1" +dependencies = [ + "bytesize", + "chrono", + "circular-buffer", + "copypasta", + "crossterm", + "futures", + "itertools 0.13.0", + "log", + "nom", + "open", + "ratatui", + "rayon", + "rdkafka", + "serde", + "serde_json", + "strum", + "thousands", + "throbber-widgets-tui", + "tokio", + "tokio-util", + "tui-input", + "yozefu-app", + "yozefu-lib", +] + +[[package]] +name = "yozefu-wasm-types" +version = "0.0.1" +dependencies = [ + "serde", + "serde_json", + "yozefu-lib", +] + +[[package]] +name = "zerocopy" +version = "0.7.35" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1b9b4fd18abc82b8136838da5d50bae7bdea537c574d8dc1a34ed098d6c166f0" +dependencies = [ + "byteorder", + "zerocopy-derive", +] + +[[package]] +name = "zerocopy-derive" +version = "0.7.35" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "fa4f8080344d4671fb4e831a13ad1e68092748387dfc4f55e356242fae12ce3e" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "zerofrom" +version = "0.1.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"cff3ee08c995dee1859d998dea82f7374f2826091dd9cd47def953cae446cd2e" +dependencies = [ + "zerofrom-derive", +] + +[[package]] +name = "zerofrom-derive" +version = "0.1.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "595eed982f7d355beb85837f651fa22e90b3c044842dc7f2c2842c086f295808" +dependencies = [ + "proc-macro2", + "quote", + "syn", + "synstructure", +] + +[[package]] +name = "zeroize" +version = "1.8.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ced3678a2879b30306d323f4542626697a464a97c0a07c9aebf7ebca65cd4dde" + +[[package]] +name = "zerovec" +version = "0.10.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "aa2b893d79df23bfb12d5461018d408ea19dfafe76c2c7ef6d4eba614f8ff079" +dependencies = [ + "yoke", + "zerofrom", + "zerovec-derive", +] + +[[package]] +name = "zerovec-derive" +version = "0.10.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6eafa6dfb17584ea3e2bd6e76e0cc15ad7af12b09abdd1ca55961bed9b1063c6" +dependencies = [ + "proc-macro2", + "quote", + "syn", +] + +[[package]] +name = "zstd" +version = "0.13.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "fcf2b778a664581e31e389454a7072dab1647606d44f7feea22cd5abb9c9f3f9" +dependencies = [ + "zstd-safe", +] + +[[package]] +name = "zstd-safe" +version = "7.2.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "54a3ab4db68cea366acc5c897c7b4d4d1b8994a9cd6e6f841f8964566a419059" +dependencies = [ + "zstd-sys", +] + +[[package]] +name = "zstd-sys" +version = "2.0.13+zstd.1.5.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "38ff0f21cfee8f97d94cef41359e0c89aa6113028ab0291aa8ca0038995a95aa" +dependencies = [ + "cc", + "pkg-config", +] diff --git a/Cargo.toml b/Cargo.toml new file mode 100644 index 0000000..d1e8253 --- /dev/null +++ b/Cargo.toml @@ -0,0 +1,69 @@ +[workspace] +members = [ + "crates/lib", + "crates/bin", + "crates/command", + "crates/app", + "crates/tui", + "crates/wasm-types", + # "crates/wasm-blueprints/rust", +] + +default-members = [ + "crates/lib", + "crates/bin", + "crates/command", + "crates/app", + "crates/tui", + "crates/wasm-types", +# "crates/wasm-blueprints/rust", +] + +resolver = "2" + +[workspace.package] +version = "0.0.1" +edition = "2021" +authors = ["Yann Prono "] +readme = "README.md" +homepage = "https://github.com/MAIF/yozefu" +repository = "https://github.com/MAIF/yozefu.git" +license = "Apache-2.0" + + +[workspace.dependencies] +lib = { package = "yozefu-lib", path = "crates/lib/", version = "0.0.1" } +app = { package = "yozefu-app", path = "crates/app/", version = "0.0.1" } +command = { package = "yozefu-command", path = "crates/command/", version = "0.0.1" } +yozefu = { package = "yozefu", path = "crates/bin/", version = "0.0.1" } +tui = { package = "yozefu-tui", path = "crates/tui/", version = "0.0.1" } +wasm-types = { package = "wasm-types", path = "crates/wasm-types/", version = "0.0.1" } + +[profile.release] +opt-level = 3 +debug = "none" +debug-assertions = false +overflow-checks = true +strip = true +lto = "fat" +panic = "abort" +incremental = false + +[workspace.metadata.release] +shared-version = true +tag-message = "chore: Release version {{version}}" +pre-release-commit-message = "chore: Release version {{version}}" +tag-name = "{{version}}" + +[workspace.metadata.cross.target.aarch64-unknown-linux-gnu] +pre-build = [ + "dpkg --add-architecture $CROSS_DEB_ARCH", + "apt-get 
update && apt-get --assume-yes install libssl-dev:$CROSS_DEB_ARCH", +] + +[workspace.metadata.cross.target.'cfg(target_os = "windows")'] +pre-build = [ + "echo \"VCPKG_ROOT=$env:VCPKG_INSTALLATION_ROOT\" | Out-File -FilePath $env:GITHUB_ENV -Append", + "vcpkg install openssl:x64-windows-static-md", + "vcpkg install openssl:x64-windows-static", +] diff --git a/Dockerfile b/Dockerfile new file mode 100644 index 0000000..45f3f41 --- /dev/null +++ b/Dockerfile @@ -0,0 +1,44 @@ +FROM rust:1.82.0-slim-bullseye AS builder +WORKDIR /app +RUN --mount=type=bind,source=crates,target=crates \ + --mount=type=bind,source=Cargo.toml,target=Cargo.toml \ + --mount=type=bind,source=Cargo.lock,target=Cargo.lock \ + --mount=type=cache,target=/app/target/ \ + --mount=type=cache,target=/usr/local/cargo/registry/ \ + < + +Build status + +Minimum supported Rust version: 1.80.1 or newer +Licence + + +Yōzefu is an interactive terminal user interface (TUI) application for exploring the data of a kafka cluster. +It is an alternative tool to [AKHQ](https://akhq.io/), [redpanda console](https://www.redpanda.com/redpanda-console-kafka-ui) or [the kafka plugin for JetBrains IDEs](https://plugins.jetbrains.com/plugin/21704-kafka). + +The tool offers the following features: + - Real-time access to data published to topics. + - A search query language inspired by SQL, providing fine-grained filtering capabilities. + - Ability to search kafka records across multiple topics. + - Support for extending the search engine with [user-defined filters](./docs/search-filter/README.md) written in WebAssembly ([Extism](https://extism.org/)). + - The tool can be used as a terminal user interface or a CLI with the `--headless` flag. + - One keystroke to export kafka records for further analysis. + - Support for registering multiple kafka clusters, each with specific kafka consumer properties. + + +By default, [the kafka consumer is configured](https://github.com/MAIF/yozefu/blob/main/crates/command/src/command/main_command.rs#L318-L325) with the property `enable.auto.commit` set to `false`, meaning no kafka consumer offsets will be committed to kafka. + + + + + + Link to a demo video + + + +## Limitations + + - The tool is designed only to consume kafka records. There is no feature to produce records or manage a cluster. + - Serialization formats such as `json`, `xml` or plain text are supported. [Avro](https://avro.apache.org/) support is [experimental for now](./docs/schema-registry/README.md). [Protobuf](https://protobuf.dev/) is not supported. + - The tool uses a ring buffer to store the [last 500 kafka records](./crates/tui/src/records_buffer.rs#L20). + - There is probably room for improvement regarding the throughput (a lot of `clone()` calls and deserialization). + - Yozefu has been tested on macOS (Apple Silicon) but not on Windows or Linux. Feedback or contributions are welcome. + + +## Getting started + +> [!NOTE] +> For a better visual experience, I invite you to install [Powerline fonts](https://github.com/powerline/fonts). + +```bash +cargo install yozefu + +# By default, it starts the TUI. +# The default registered cluster is localhost +yozf --cluster localhost + +# You can also start the tool in headless mode.
+# It prints the key of each kafka record matching the query in real time +yozf --cluster localhost \ + --headless \ + --topics "public-french-addresses" \ + --format "json" \ + 'from begin value.properties.type contains "street" and offset < 356_234 limit 10' \ + | jq '.key' + + +# Use the `configure` command to define new clusters +yozf configure + +# You can create search filters +yozf create-filter --language rust key-ends-with + +# And import them +yozf import-filter path/to/key-ends-with.wasm +``` + +You can also download pre-built binaries from the [releases section](https://github.com/MAIF/yozefu/releases). [Attestations](https://github.com/MAIF/yozefu/attestations) are available: +```bash +gh attestation verify --repo MAIF/yozefu +``` + + +## Try it + +> [!NOTE] +> Docker is required to start a single-node Kafka cluster on your machine. [JBang](https://www.jbang.dev/) is not required but recommended if you want to produce records with the schema registry. + + +```bash +# It clones this repository, starts a docker kafka node and produces some json records +curl -L "https://raw.githubusercontent.com/MAIF/yozefu/refs/heads/main/docs/try-it.sh" | bash + +yozf -c localhost +``` + + +## Documentation + + - [The query language.](./docs/query-language/README.md) + - [Creating a search filter.](./docs/search-filter/README.md) + - [Configuring the tool with TLS.](./docs/tls/README.md) + - [URL templates to switch to web applications.](./docs/url-templates/README.md) + - [Schema registry.](./docs/schema-registry/README.md) + - [Themes.](./docs/themes/README.md) + - [Releasing a new version.](./docs/release/README.md) + diff --git a/compose.yml b/compose.yml new file mode 100644 index 0000000..26640d8 --- /dev/null +++ b/compose.yml @@ -0,0 +1,146 @@ +services: + + kafka: + image: confluentinc/cp-kafka:7.7.1 + container_name: yozefu-kafka + ports: + - "9092:9092" + - "9101:9101" + environment: + KAFKA_NODE_ID: 1 + KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: 'INTERNAL:PLAINTEXT,CONTROLLER:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT' + KAFKA_ADVERTISED_LISTENERS: 'INTERNAL://kafka:19092,PLAINTEXT_HOST://localhost:9092' + KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1 + KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0 + KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1 + KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1 + KAFKA_JMX_PORT: 9101 + KAFKA_JMX_HOSTNAME: localhost + KAFKA_PROCESS_ROLES: 'broker,controller' + KAFKA_CONTROLLER_QUORUM_VOTERS: '1@kafka:29093' + KAFKA_LISTENERS: 'INTERNAL://kafka:19092,CONTROLLER://kafka:29093,PLAINTEXT_HOST://0.0.0.0:9092' + KAFKA_INTER_BROKER_LISTENER_NAME: 'INTERNAL' + KAFKA_CONTROLLER_LISTENER_NAMES: 'CONTROLLER' + CLUSTER_ID: 'MkU3OEVBNTcwNTJENDM2Qk' + + schema-registry: + image: confluentinc/cp-schema-registry:7.7.1 + hostname: schema-registry + container_name: yozefu-schema-registry + depends_on: + - kafka + ports: + - "${SCHEMA_REGISTRY_PORT:-8081}:${SCHEMA_REGISTRY_PORT:-8081}" + environment: + SCHEMA_REGISTRY_HOST_NAME: schema-registry + SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: 'kafka:19092' + SCHEMA_REGISTRY_LISTENERS: http://0.0.0.0:${SCHEMA_REGISTRY_PORT:-8081} + healthcheck: + test: ["CMD", "curl", "-f", "http://localhost:${SCHEMA_REGISTRY_PORT:-8081}/subjects"] + interval: 2s + timeout: 1s + retries: 10 + + akhq: + container_name: yozefu-akhq + image: tchiotludo/akhq + restart: unless-stopped + environment: + AKHQ_CONFIGURATION: | + akhq: + connections: + kafka-localhost-server: + properties: + bootstrap.servers: "kafka:19092" + schema-registry: + type: "confluent" + url:
"http://schema-registry:${SCHEMA_REGISTRY_PORT:-8081}" + ports: + - "9000:8080" + + redpanda-console: + profiles: ["disabled"] + container_name: yozefu-redpanda-console + image: docker.redpanda.com/redpandadata/console:latest + entrypoint: /bin/sh + command: -c "echo \"$$CONSOLE_CONFIG_FILE\" > /tmp/config.yml; /app/console" + environment: + CONFIG_FILEPATH: /tmp/config.yml + CONSOLE_CONFIG_FILE: | + kafka: + brokers: ["kafka:19092"] + schemaRegistry: + enabled: true + urls: ["http://schema-registry:${SCHEMA_REGISTRY_PORT:-8081}"] + ports: + - "9001:8080" + + kafka-ui: + profiles: ["disabled"] + container_name: yozefu-kafka-ui + image: ghcr.io/kafbat/kafka-ui + ports: + - "9002:8080" + environment: + DYNAMIC_CONFIG_ENABLED: 'true' + + kafdrop: + profiles: ["disabled"] + container_name: yozefu-kafdrop + image: obsidiandynamics/kafdrop + ports: + - "9003:9000" + environment: + KAFKA_BROKERCONNECT: kafka:19092 + CMD_ARGS: "--schemaregistry.connect=http://schema-registry:${SCHEMA_REGISTRY_PORT:-8081}" + + kouncil: + profiles: ["disabled"] + container_name: yozefu-kouncil + image: consdata/kouncil:latest + ports: + - "9004:8080" + environment: + - bootstrapServers="kafka:19092 + + kpow: + profiles: ["disabled"] + image: factorhouse/kpow-ce:latest + container_name: yozefu-kpow + ports: + - "9005:3000" + + provectus-kafka-ui: + profiles: ["disabled"] + container_name: yozefu-provectus + image: provectuslabs/kafka-ui + ports: + - "9006:8080" + environment: + DYNAMIC_CONFIG_ENABLED: 'true' + + blazingkraft: + profiles: ["disabled"] + image: blazinginstruments/blazingkraft:latest + ports: + - "7766:7766" + environment: + - BLAZINGKRAFT_ADMIN_EMAIL=root + - BLAZINGKRAFT_ADMIN_PASSWORD=root + + pulsar: + profiles: ["disabled"] + image: apachepulsar/pulsar:4.0.0 + ports: + - "6650:6650" + - "8080:8080" + environment: + - BLAZINGKRAFT_ADMIN_EMAIL=root + - BLAZINGKRAFT_ADMIN_PASSWORD=root + entrypoint: bin/pulsar standalone + + apicurio: + profiles: ["disabled"] + image: apicurio/apicurio-registry:latest + ports: + - "9007:8080" \ No newline at end of file diff --git a/crates/app/Cargo.toml b/crates/app/Cargo.toml new file mode 100644 index 0000000..eaa83dd --- /dev/null +++ b/crates/app/Cargo.toml @@ -0,0 +1,25 @@ +[package] +name = "yozefu-app" +description = "A TUI for browsing kafka topics" +keywords = ["kafka", "consumer"] +categories = ["command-line-utilities", "development-tools"] +readme = "README.md" +version.workspace = true +authors.workspace = true +edition.workspace = true +homepage.workspace = true +license.workspace = true +repository.workspace = true + +[dependencies] +serde = { version = "1.0.215", features = ["derive"] } +serde_json = { version = "1.0.133", features = ["preserve_order"] } +log = "0.4.22" +lib = { workspace = true, features = ["native"] } +itertools = "0.13.0" +thousands = "0.2.0" +indexmap = "2.7.0" +rdkafka = { version = "0.37.0", features = ["cmake-build"] } +async-trait = "0.1.83" +extism = "1.9.1" +url = { version = "2.5.4", features = ["serde"] } diff --git a/crates/app/LICENSE b/crates/app/LICENSE new file mode 100644 index 0000000..2b04c87 --- /dev/null +++ b/crates/app/LICENSE @@ -0,0 +1,13 @@ +Copyright [2024] yozefu-app + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. 
+You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. \ No newline at end of file diff --git a/crates/app/README.md b/crates/app/README.md new file mode 100644 index 0000000..97e94fc --- /dev/null +++ b/crates/app/README.md @@ -0,0 +1,15 @@ +# yozefu-app + +[![Build](https://github.com/MAIF/yozefu/actions/workflows/build.yml/badge.svg)](https://github.com/MAIF/yozefu/actions/workflows/build.yml) +[![](https://img.shields.io/crates/v/yozefu-app.svg)](https://crates.io/crates/yozefu-app) + + +This library contains the code for the main kafka consumer and all the logic to filter kafka records based on the user-provided search query. + + + +## Usage + +```bash +cargo add yozefu-app +``` \ No newline at end of file diff --git a/crates/app/src/app.rs b/crates/app/src/app.rs new file mode 100644 index 0000000..a86abde --- /dev/null +++ b/crates/app/src/app.rs @@ -0,0 +1,316 @@ +//! This app is both a kafka consumer and a kafka admin client. +use lib::{ + kafka::SchemaRegistryClient, search::offset::FromOffset, ConsumerGroupDetail, Error, + ExportedKafkaRecord, KafkaRecord, TopicDetail, +}; +use log::{info, warn}; +use rdkafka::{ + consumer::BaseConsumer, + consumer::{Consumer, StreamConsumer}, + ClientConfig, Offset, TopicPartitionList, +}; +use thousands::Separable; + +use std::{ + collections::HashSet, + fs, + path::PathBuf, + sync::{LazyLock, Mutex}, + time::Duration, +}; + +use itertools::Itertools; + +use crate::{ + search::{Search, ValidSearchQuery}, + Config, +}; +// TODO this is bad +pub static CONFIG: LazyLock<Mutex<Config>> = LazyLock::new(|| Config::default().into()); + +/// Struct exposing different functions for consuming kafka records. +#[derive(Debug, Clone)] +pub struct App { + pub config: Config, + pub cluster: String, + pub kafka_config: ClientConfig, + pub search_query: ValidSearchQuery, + pub output_file: PathBuf, +} + +impl App { + pub fn new( + config: Config, + cluster: String, + kafka_config: ClientConfig, + search_query: ValidSearchQuery, + output_file: PathBuf, + ) -> Self { + Self { + config, + cluster, + kafka_config, + search_query, + output_file, + } + } + + /// Loads the config file.
It's a global variable + /// TODO This is not good + pub fn load_config(config: &Config) { + let mut d = CONFIG.lock().unwrap(); + *d = config.clone(); + } + + pub fn schema_registry(&self) -> Option { + match self.config.schema_registry_config_of(&self.cluster) { + Some(config) => Some(SchemaRegistryClient::new(config.url, &config.headers)), + None => None, + } + } + + /// Create a kafka consumer + pub fn create_consumer(&self, topics: &Vec) -> Result { + let offset = self.search_query.offset().unwrap_or(FromOffset::End); + match offset { + FromOffset::Beginning => self.assign_partitions(topics, Offset::Beginning), + FromOffset::End => self.assign_partitions(topics, Offset::End), + FromOffset::Offset(o) => self.assign_partitions(topics, Offset::Offset(o)), + FromOffset::OffsetTail(o) => self.assign_partitions(topics, Offset::OffsetTail(o)), + FromOffset::Timestamp(timestamp) => { + let consumer: StreamConsumer = self.kafka_config.create()?; + let mut tp = TopicPartitionList::new(); + for t in topics { + let metadata = consumer.fetch_metadata(Some(t), Duration::from_secs(10))?; + for m in metadata.topics() { + for p in m.partitions() { + tp.add_partition(m.name(), p.id()); + } + } + } + tp.set_all_offsets(Offset::Offset(timestamp))?; + let tt = consumer.offsets_for_times(tp, Duration::from_secs(60))?; + consumer.assign(&tt)?; + Ok(consumer) + } + } + } + + /// Exports a given kafka record to a file. + /// The Name of the file is automatically generated at the runtime + pub fn export_record(&self, record: &KafkaRecord) -> Result<(), Error> { + fs::create_dir_all(self.output_file.parent().unwrap())?; + let content = fs::read_to_string(&self.output_file).unwrap_or("[]".to_string()); + let mut exported_records: Vec = serde_json::from_str(&content)?; + + let mut exported_record_kafka: ExportedKafkaRecord = record.into(); + exported_record_kafka.set_search_query(self.search_query.query()); + exported_records.push(exported_record_kafka); + exported_records.sort_by(|a, b| { + a.record + .timestamp + .cmp(&b.record.timestamp) + .then(a.record.offset.cmp(&b.record.offset)) + }); + exported_records.dedup(); + for i in 1..exported_records.len() { + let first_ts = exported_records.first().unwrap().record.timestamp; + let previous_ts = exported_records.get(i - 1).unwrap().record.timestamp; + let current = exported_records.get_mut(i).unwrap(); + current.compute_deltas_ms(first_ts, previous_ts); + } + + fs::write( + &self.output_file, + serde_json::to_string_pretty(&exported_records)?, + )?; + info!( + "A record has been exported into file '{}'", + self.output_file.display() + ); + Ok(()) + } + + /// Calculates an estimate of the number of records that are going to be read. + /// This function is used to render a progress bar. 
+ pub fn estimate_number_of_records_to_read( + &self, + topic_partition_list: TopicPartitionList, + ) -> Result { + let client: StreamConsumer = self.create_assigned_consumer()?; + let mut count = 0; + for t in topic_partition_list.elements() { + // this function call be very slow + let watermarks: (i64, i64) = + match client.fetch_watermarks(t.topic(), t.partition(), Duration::from_secs(10)) { + Ok(i) => i, + Err(e) => { + warn!( + "I was not able to fetch watermarks of topic '{}', partition {}: {}", + t.partition(), + t.topic(), + e + ); + (0, 0) + } + }; + count += match t.offset() { + Offset::Beginning => watermarks.1 - watermarks.0, + Offset::End => 0, + Offset::Stored => 1, + Offset::Invalid => 1, + Offset::Offset(o) => watermarks.1 - o, + Offset::OffsetTail(o) => o, + } + } + + info!( + "{} are about to be consumed on the following topic partitions: [{}]", + count.separate_with_underscores(), + topic_partition_list + .elements() + .iter() + .map(|e| format!("{}-{}", e.topic(), e.partition())) + .join(", ") + ); + Ok(count) + } + + fn create_assigned_consumer(&self) -> Result { + self.kafka_config.create().map_err(|e| e.into()) + } + + /// Assigns topics to a consumer + fn assign_partitions( + &self, + topics: &Vec, + offset: Offset, + ) -> Result { + let consumer = self.create_assigned_consumer()?; + let mut assignments = TopicPartitionList::new(); + for topic in topics { + let metadata = consumer.fetch_metadata(Some(topic), Duration::from_secs(10))?; + for t in metadata.topics() { + for p in t.partitions() { + assignments.add_partition_offset(topic, p.id(), offset)?; + } + } + } + consumer.assign(&assignments)?; + info!("New Consumer created, about to consume {:?}", topics); + Ok(consumer) + } + + /// Returns the topics details for a given list topics + /// This function is not ready yet + pub fn topic_details(&self, topics: HashSet) -> Result, Error> { + let mut results = vec![]; + for topic in topics { + let consumer: BaseConsumer = self.kafka_config.create()?; + let metadata = consumer.fetch_metadata(Some(&topic), Duration::from_secs(10))?; + let metadata = metadata.topics().first().unwrap(); + let mut detail = TopicDetail { + name: topic.clone(), + replicas: metadata.partitions().first().unwrap().replicas().len(), + partitions: metadata.partitions().len(), + consumer_groups: vec![], + }; + let mut consumer_groups = vec![]; + let metadata = consumer.fetch_group_list(None, Duration::from_secs(10))?; + for g in metadata.groups() { + consumer_groups.push(ConsumerGroupDetail { + name: g.name().to_string(), + members: vec![], //Self::parse_members(g, g.members())?, + state: g.state().parse()?, + }); + } + + detail.consumer_groups = consumer_groups; + results.push(detail); + } + + Ok(results) + } + + /// Lists available kafka topics on the cluster. 
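As a rough usage sketch of the `App` methods defined in this file (the `Config`, `ClientConfig`, `ValidSearchQuery` and output path are assumed to be built elsewhere, and error handling is elided):

```rust
// Hypothetical wiring, not part of the diff.
let app = App::new(config, "localhost".to_string(), kafka_config, search_query, output_file);
let topics = app.list_topics()?;              // names of every topic on the cluster
let consumer = app.create_consumer(&topics)?; // StreamConsumer assigned according to the query offset
```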
+ pub fn list_topics(&self) -> Result, Error> { + let consumer: StreamConsumer = self.create_assigned_consumer()?; + let metadata = consumer.fetch_metadata(None, Duration::from_secs(10))?; + let topics = metadata + .topics() + .iter() + .map(|t| t.name().to_string()) + .collect_vec(); + Ok(topics) + } + + // TODO https://github.com/fede1024/rust-rdkafka/pull/680 + // pub fn parse_members( + // group: &GroupInfo, + // members: &[GroupMemberInfo], + // ) -> Result, anyhow::Error> { + // return Ok(vec![]); + // let members = members + // .iter() + // .map(|member| { + // let mut assigns = Vec::new(); + // if group.protocol_type() == "consumer" { + // if let Some(assignment) = member.assignment() { + // let mut payload_rdr = Cursor::new(assignment); + // assigns = Self::parse_member_assignment(&mut payload_rdr) + // .expect("Parse member assignment failed"); + // } + // } + // ConsumerGroupMember { + // member: member.id().to_owned(), + // start_offset: 0, + // end_offset: 0, + // assignments: assigns, + // } + // }) + // .collect::>(); + // + // Ok(members) + // } + // + // fn parse_member_assignment( + // payload_rdr: &mut Cursor<&[u8]>, + // ) -> Result, anyhow::Error> { + // return Ok(vec![]); + // let _version = payload_rdr.read_i16::()?; + // let assign_len = payload_rdr.read_i32::()?; + // let mut assigns = Vec::with_capacity(assign_len as usize); + // for _ in 0..assign_len { + // let topic = read_str(payload_rdr)?.to_owned(); + // let partition_len = payload_rdr.read_i32::()?; + // let mut partitions = Vec::with_capacity(partition_len as usize); + // for _ in 0..partition_len { + // let partition = payload_rdr.read_i32::()?; + // partitions.push(partition); + // } + // assigns.push(MemberAssignment { topic, partitions }) + // } + // Ok(assigns) + // } + + /// Lists available topics on the cluster with a custom kafka client. + pub fn list_topics_from_client(kafka_config: &ClientConfig) -> Result, Error> { + let consumer: StreamConsumer = kafka_config.create()?; + let metadata = consumer.fetch_metadata(None, Duration::from_secs(3))?; + let topics = metadata + .topics() + .iter() + .map(|t| t.name().to_string()) + .collect_vec(); + Ok(topics) + } +} + +//fn read_str<'a>(rdr: &'a mut Cursor<&[u8]>) -> Result<&'a str, Error> { +// let len = (rdr.read_i16::())? as usize; +// let pos = rdr.position() as usize; +// let slice = str::from_utf8(&rdr.get_ref()[pos..(pos + len)])?; +// rdr.consume(len); +// Ok(slice) +//} +// diff --git a/crates/app/src/config.rs b/crates/app/src/config.rs new file mode 100644 index 0000000..f8a936f --- /dev/null +++ b/crates/app/src/config.rs @@ -0,0 +1,188 @@ +//! 
module defining the configuration structure of the application + +use indexmap::IndexMap; +use itertools::Itertools; +use lib::Error; +use serde::{Deserialize, Serialize}; +use serde_json::Value; +use std::{ + collections::HashMap, + fs, + path::{Path, PathBuf}, +}; +use url::Url; + +use crate::APPLICATION_NAME; + +static EXAMPLE_PROMPTS: &[&str] = &[ + r#"timestamp between "2 hours ago" and "1 hour ago" limit 100 from beginning"#, + r#"offset > 100000 and value contains "music" limit 10"#, + r#"key == "ABC" and timestamp >= "2 days ago""#, +]; + +/// Configuration of the application +#[derive(Debug, Deserialize, Serialize, PartialEq, Eq, Clone)] +pub struct Config { + /// Path of the config file + #[serde(skip)] + pub path: PathBuf, + /// A placeholder url that will be used when you want to open a kafka record in the browser + #[serde(default = "default_url_template")] + pub default_url_template: String, + /// The initial search query when you start the UI + pub initial_query: String, + #[serde(default = "default_theme")] + pub theme: String, + /// The kafka properties for each cluster + pub clusters: IndexMap, + /// The default kafka properties inherited for every cluster + pub default_kafka_config: IndexMap, + /// History of past search queries + pub history: Vec, + /// Show shortcuts + #[serde(default = "default_show_shortcuts")] + pub show_shortcuts: bool, + #[serde(default = "default_export_directory")] + pub export_directory: PathBuf, +} + +impl Default for ClusterConfig { + fn default() -> Self { + Self { + url_template: Some(default_url_template()), + schema_registry: None, + kafka: Default::default(), + } + } +} + +/// Specific configuration for a cluster +#[derive(Debug, Deserialize, Serialize, PartialEq, Eq, Clone)] +pub struct ClusterConfig { + /// A placeholder url that will be used when you want to open a kafka record in the browser + pub url_template: Option, + /// Schema registry configuration + #[serde(skip_serializing_if = "Option::is_none")] + pub schema_registry: Option, + // Kafka consumer properties for this cluster, see for more details + pub kafka: IndexMap, +} + +/// Schema registry configuration of a given cluster +#[derive(Debug, Deserialize, PartialEq, Eq, Serialize, Clone)] +pub struct SchemaRegistryConfig { + /// Url of the schema registry + pub url: Url, + /// HTTP headers to be used when communicating with the schema registry + #[serde(default = "HashMap::default")] + pub headers: HashMap, +} + +fn default_url_template() -> String { + "http://localhost/cluster/{topic}/{partition}/{offset}".to_string() +} + +fn default_export_directory() -> PathBuf { + PathBuf::from(format!("./{}-exports", APPLICATION_NAME)) +} + +fn default_theme() -> String { + "light".to_string() +} + +fn default_show_shortcuts() -> bool { + true +} + +impl Default for Config { + fn default() -> Self { + let mut kafka_config = IndexMap::new(); + kafka_config.insert("fetch.max.bytes".to_string(), "3000000".to_string()); + Self { + path: PathBuf::default(), + default_url_template: default_url_template(), + history: EXAMPLE_PROMPTS.iter().map(|e| e.to_string()).collect_vec(), + initial_query: "from end - 10".to_string(), + clusters: IndexMap::default(), + default_kafka_config: IndexMap::default(), + theme: default_theme(), + show_shortcuts: true, + export_directory: default_export_directory(), + } + } +} + +impl Config { + pub fn new(path: &Path) -> Self { + Self { + path: path.to_path_buf(), + ..Default::default() + } + } + + /// Reads a configuration file. 
+ pub fn read(file: &Path) -> Result { + let content = fs::read_to_string(file)?; + let mut config: Config = serde_json::from_str(&content)?; + config.path = file.to_path_buf(); + Ok(config) + } + + /// Returns the name of the logs file + pub fn logs_file(&self) -> PathBuf { + self.path.parent().unwrap().join("application.log") + } + + /// Returns the name of the logs file + pub fn themes_file(&self) -> PathBuf { + self.path.parent().unwrap().join("themes.json") + } + + /// Returns the list of available theme names. + pub fn themes(&self) -> Vec { + let file = self.themes_file(); + let content = fs::read_to_string(file).unwrap_or("{}".to_string()); + let themes: HashMap = serde_json::from_str(&content).unwrap_or_default(); + themes.keys().map(|e| e.to_string()).collect_vec() + } + + /// Returns the name of the directory containing wasm filters + pub fn filters_dir(&self) -> PathBuf { + let dir = self.path.parent().unwrap().join("filters"); + let _ = fs::create_dir_all(&dir); + dir + } + + /// web URL template for a given cluster + pub fn url_template_of(&self, cluster: &str) -> String { + self.clusters + .get(cluster) + .and_then(|e| e.url_template.clone()) + .unwrap_or(self.default_url_template.clone()) + } + + /// Returns the kafka properties for the given cluster. + pub fn kafka_config_of(&self, cluster: &str) -> Result, Error> { + let mut config = HashMap::new(); + + config.extend(self.default_kafka_config.clone()); + + if !self.clusters.contains_key(cluster) { + return Err(Error::Error(format!( + "I was not able to find the '{}' cluster. Available clusters are: [{}]", + cluster, + self.clusters.keys().join(", ") + ))); + } + + let env_config = self.clusters.get(cluster.trim()).unwrap(); + config.extend(env_config.kafka.clone()); + Ok(config) + } + + /// Returns the schema registry configuration for the given cluster. + pub fn schema_registry_config_of(&self, cluster: &str) -> Option { + let cluster_config = self.clusters.get(cluster.trim()).unwrap(); + cluster_config.schema_registry.clone() + } +} diff --git a/crates/app/src/lib.rs b/crates/app/src/lib.rs new file mode 100644 index 0000000..9a3f3e8 --- /dev/null +++ b/crates/app/src/lib.rs @@ -0,0 +1,15 @@ +//! The core of the tool: +//! - List topics, and consume records, +//! - Fetch information about a given topic, +//! - Consume records. 
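Putting the serde definitions of `Config`, `ClusterConfig` and `SchemaRegistryConfig` together, a `config.json` accepted by `Config::read` could look like the sketch below (values are purely illustrative):

```rust
use app::Config;

// Hypothetical configuration content, not part of the diff.
let raw = r#"{
  "default_url_template": "http://localhost/cluster/{topic}/{partition}/{offset}",
  "initial_query": "from end - 10",
  "theme": "light",
  "clusters": {
    "localhost": {
      "url_template": "http://localhost/cluster/{topic}/{partition}/{offset}",
      "schema_registry": { "url": "http://localhost:8081", "headers": {} },
      "kafka": { "bootstrap.servers": "localhost:9092" }
    }
  },
  "default_kafka_config": { "fetch.max.bytes": "3000000" },
  "history": [],
  "show_shortcuts": true,
  "export_directory": "./yozefu-exports"
}"#;
let config: Config = serde_json::from_str(raw).unwrap();
assert_eq!(config.theme, "light");
```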
+mod app; +mod config; +pub mod search; + +pub use app::App; +pub use config::ClusterConfig; +pub use config::Config; +pub use config::SchemaRegistryConfig; + +/// Name of the application +pub const APPLICATION_NAME: &str = "yozefu"; diff --git a/crates/app/src/search/atom.rs b/crates/app/src/search/atom.rs new file mode 100644 index 0000000..20bb613 --- /dev/null +++ b/crates/app/src/search/atom.rs @@ -0,0 +1,34 @@ +use async_trait::async_trait; +use lib::search::{atom::Atom, filter::Filter, offset::FromOffset}; + +use super::{Search, SearchContext}; + +#[async_trait] +impl Search for Atom { + fn offset(&self) -> Option { + match self { + Atom::Symbol(_) => None, + Atom::Compare(c) => c.offset(), + Atom::Parenthesis(c) => c.offset(), + Atom::Filter(_) => None, + } + } + + fn matches(&self, context: &SearchContext) -> bool { + match self { + Atom::Symbol(_) => false, + Atom::Compare(e) => e.matches(context), + Atom::Parenthesis(e) => e.matches(context), + Atom::Filter(f) => f.matches(context), + } + } + + fn filters(&self) -> Vec { + match self { + Atom::Symbol(_) => vec![], + Atom::Compare(e) => e.filters(), + Atom::Parenthesis(e) => e.filters(), + Atom::Filter(f) => vec![f.clone()], + } + } +} diff --git a/crates/app/src/search/compare.rs b/crates/app/src/search/compare.rs new file mode 100644 index 0000000..1b79930 --- /dev/null +++ b/crates/app/src/search/compare.rs @@ -0,0 +1,112 @@ +use async_trait::async_trait; +use lib::{ + kafka::Comparable, + search::{ + compare::{CompareExpression, NumberOperator, StringOperator}, + filter::Filter, + offset::FromOffset, + }, +}; + +use crate::search::Search; + +use super::SearchContext; + +#[async_trait] +impl Search for CompareExpression { + fn offset(&self) -> Option { + match self { + CompareExpression::Offset(NumberOperator::Equal, e) => Some(FromOffset::Offset(*e)), + CompareExpression::Offset(NumberOperator::GreaterOrEqual, e) => { + Some(FromOffset::Offset(*e)) + } + CompareExpression::Offset(NumberOperator::GreaterThan, e) => { + Some(FromOffset::Offset(*e + 1)) + } + CompareExpression::OffsetTail(e) => Some(FromOffset::OffsetTail(*e)), + CompareExpression::Timestamp(op, e) => match op { + NumberOperator::GreaterThan => Some(FromOffset::Timestamp(e.timestamp_millis())), + NumberOperator::GreaterOrEqual => { + Some(FromOffset::Timestamp(e.timestamp_millis() - 1000)) + } + NumberOperator::Equal => Some(FromOffset::Timestamp(e.timestamp_millis() - 1000)), + NumberOperator::LowerThan => None, + NumberOperator::LowerOrEqual => None, + _ => None, + }, + _ => None, + } + } + + fn matches(&self, context: &SearchContext) -> bool { + let record = context.record; + match self { + CompareExpression::OffsetTail(_) => true, + CompareExpression::Partition(op, p) => match op { + &NumberOperator::GreaterThan => record.partition > *p, + NumberOperator::GreaterOrEqual => record.partition >= *p, + NumberOperator::LowerThan => record.partition < *p, + NumberOperator::LowerOrEqual => record.partition <= *p, + NumberOperator::Equal => record.partition == *p, + NumberOperator::NotEqual => record.partition != *p, + }, + CompareExpression::Offset(op, p) => match op { + NumberOperator::GreaterThan => record.offset > *p, + NumberOperator::GreaterOrEqual => record.offset >= *p, + NumberOperator::LowerThan => record.offset < *p, + NumberOperator::LowerOrEqual => record.offset <= *p, + NumberOperator::Equal => record.offset == *p, + NumberOperator::NotEqual => record.offset != *p, + }, + CompareExpression::Topic(op, t) => match op { + StringOperator::Equal => 
record.topic == *t, + StringOperator::NotEqual => record.topic != *t, + StringOperator::Contain => record.topic.contains(t), + StringOperator::StartWith => record.topic.starts_with(t), + }, + CompareExpression::Size(op, s) => match op { + NumberOperator::GreaterThan => record.size > *s as usize, + NumberOperator::GreaterOrEqual => record.size >= *s as usize, + NumberOperator::LowerThan => record.size < *s as usize, + NumberOperator::LowerOrEqual => record.size <= *s as usize, + NumberOperator::Equal => record.size == *s as usize, + NumberOperator::NotEqual => record.size != *s as usize, + }, + CompareExpression::Key(op, t) => record.key.compare(&None, op, t), + CompareExpression::Value(left, op, t) => record.value.compare(left, op, t), + CompareExpression::Header(left, op, t) => { + let headers = &record.headers; + let header = headers.get(left); + if header.is_none() { + return false; + } + let header = header.unwrap(); + match op { + StringOperator::Contain => header.contains(t), + StringOperator::Equal => header == t, + StringOperator::StartWith => header.starts_with(t), + StringOperator::NotEqual => header != t, + } + } + CompareExpression::Timestamp(op, t) => { + let ts = record.timestamp_as_local_date_time().unwrap(); + match op { + NumberOperator::GreaterThan => ts > *t, + NumberOperator::GreaterOrEqual => ts >= *t, + NumberOperator::LowerThan => ts < *t, + NumberOperator::LowerOrEqual => ts <= *t, + NumberOperator::Equal => ts == *t, + NumberOperator::NotEqual => ts != *t, + } + } + CompareExpression::TimestampBetween(from, to) => { + let ts = record.timestamp_as_local_date_time().unwrap(); + from <= &ts && &ts <= to + } + } + } + + fn filters(&self) -> Vec { + vec![] + } +} diff --git a/crates/app/src/search/expression.rs b/crates/app/src/search/expression.rs new file mode 100644 index 0000000..96e9a39 --- /dev/null +++ b/crates/app/src/search/expression.rs @@ -0,0 +1,82 @@ +use async_trait::async_trait; +use lib::search::{ + expression::{AndExpression, OrExpression}, + filter::Filter, + offset::FromOffset, +}; + +use super::{Search, SearchContext}; + +#[async_trait] +impl Search for AndExpression { + fn offset(&self) -> Option { + match self { + Self::AndTerm(t) => t.offset(), + Self::AndExpression(v) => { + for vv in v { + let o = vv.offset(); + if o.is_some() { + return o; + } + } + None + } + } + } + + fn matches(&self, context: &SearchContext) -> bool { + let record = context; + match self { + Self::AndTerm(t) => t.matches(record), + Self::AndExpression(e) => { + for ee in e { + if !ee.matches(record) { + return false; + } + } + true + } + } + } + + fn filters(&self) -> Vec { + match self { + AndExpression::AndTerm(term) => term.filters(), + AndExpression::AndExpression(vec) => vec.iter().flat_map(|t| t.filters()).collect(), + } + } +} + +#[async_trait] +impl Search for OrExpression { + fn offset(&self) -> Option { + match self { + Self::OrTerm(t) => t.offset(), + Self::OrExpression(_) => None, + } + } + + fn matches(&self, context: &SearchContext) -> bool { + match self { + Self::OrTerm(t) => t.matches(context), + Self::OrExpression(e) => { + if e.is_empty() { + return true; + } + for ee in e { + if ee.matches(context) { + return true; + } + } + false + } + } + } + + fn filters(&self) -> Vec { + match self { + OrExpression::OrTerm(and_expression) => and_expression.filters(), + OrExpression::OrExpression(vec) => vec.iter().flat_map(|e| e.filters()).collect(), + } + } +} diff --git a/crates/app/src/search/filter.rs b/crates/app/src/search/filter.rs new file mode 100644 index 
0000000..2ccce99 --- /dev/null +++ b/crates/app/src/search/filter.rs @@ -0,0 +1,50 @@ +use std::{ + collections::HashMap, + path::PathBuf, + sync::{LazyLock, Mutex}, +}; + +use extism::{convert::Json, Plugin}; +use itertools::Itertools; +use lib::{ + search::filter::{Filter, FilterInput}, + FilterResult, +}; + +use super::{Search, SearchContext}; + +// This is evil, TODO context +pub static CACHED_FILTERS: LazyLock>> = + LazyLock::new(|| HashMap::new().into()); + +// This is evil, TODO context +pub static FILTERS_DIR: LazyLock> = LazyLock::new(|| PathBuf::new().into()); + +pub const MATCHES_FUNCTION_NAME: &str = "matches"; +pub const PARSE_PARAMETERS_FUNCTION_NAME: &str = "parse_parameters"; + +impl Search for Filter { + fn matches(&self, context: &SearchContext) -> bool { + let mut filters = CACHED_FILTERS.lock().unwrap(); + let plugin = &mut filters.get_mut(&self.name).unwrap(); + let input = FilterInput { + record: context.record.clone(), + params: self.parameters.iter().map(|e| e.json()).collect_vec(), + }; + + match plugin + .call::>( + MATCHES_FUNCTION_NAME, + serde_json::to_string(&input).unwrap(), + ) + .map(|e| e.0) + { + Ok(res) => res.r#match, + Err(_e) => true, + } + } + + fn filters(&self) -> Vec { + vec![] + } +} diff --git a/crates/app/src/search/mod.rs b/crates/app/src/search/mod.rs new file mode 100644 index 0000000..150d4c6 --- /dev/null +++ b/crates/app/src/search/mod.rs @@ -0,0 +1,129 @@ +//! Module implementing the search logic + +use std::{ + collections::HashMap, + str::FromStr, + sync::{LazyLock, Mutex}, +}; + +use async_trait::async_trait; +use extism::{Manifest, Plugin, Wasm}; +use filter::{CACHED_FILTERS, FILTERS_DIR, PARSE_PARAMETERS_FUNCTION_NAME}; +use itertools::Itertools; +use lib::{ + parse_search_query, + search::{filter::Filter, offset::FromOffset}, + KafkaRecord, SearchQuery, +}; +use log::error; + +pub mod atom; +pub mod compare; +pub mod expression; +pub mod filter; +pub mod search_query; +pub mod term; + +#[async_trait] +pub trait Search { + /// Returns the offset from which the search should start. + fn offset(&self) -> Option { + None + } + /// returns `true` if the record matches the search query. + fn matches(&self, context: &SearchContext) -> bool; + + /// Returns the search filters that are used in the search query. + fn filters(&self) -> Vec; +} + +/// Struct that holds the context of the search. +/// It contains the record that is being searched and the loaded search filters. 
+pub struct SearchContext<'a> { + pub record: &'a KafkaRecord, + pub filters: &'a LazyLock>>, +} + +impl SearchContext<'_> { + pub fn new(record: &KafkaRecord) -> SearchContext<'_> { + SearchContext { + record, + filters: &CACHED_FILTERS, + } + } +} + +#[derive(Debug, Clone, PartialEq, Default)] +pub struct ValidSearchQuery(SearchQuery); + +impl ValidSearchQuery { + pub fn is_empty(&self) -> bool { + self.0.is_empty() + } + + pub fn limit(&self) -> Option { + self.0.limit + } + + pub fn query(&self) -> &SearchQuery { + &self.0 + } +} + +impl FromStr for ValidSearchQuery { + type Err = lib::Error; + + fn from_str(input: &str) -> Result { + let query = parse_search_query(input).map_err(lib::Error::Search)?.1; + let filters = query.filters(); + let dir = FILTERS_DIR.lock().unwrap().clone(); + for filter in filters { + let name = filter.name; + let path = dir.join(format!("{}.wasm", &name)); + let url = Wasm::file(&path); + let manifest = Manifest::new([url]); + let mut filters = CACHED_FILTERS.lock().unwrap(); + if !filters.contains_key(&name) { + match Plugin::new(manifest, [], true) { + Ok(plugin) => filters.insert(name.to_string(), plugin), + Err(err) => { + error!("No such file '{}': {}", path.display(), err); + return Err(lib::Error::Error(format!( + "Cannot find search filter '{}'", + name + ))); + } + }; + } + let params = filter.parameters; + let wasm_module = &mut filters.get_mut(&name).unwrap(); + if let Err(e) = wasm_module.call::<&str, &str>( + PARSE_PARAMETERS_FUNCTION_NAME, + &serde_json::to_string(¶ms.iter().map(|e| e.json()).collect_vec()).unwrap(), + ) { + error!( + "Error when calling '{}' from wasm module '{}': {:?}", + PARSE_PARAMETERS_FUNCTION_NAME, name, e + ); + return Err(lib::Error::Error(format!("{}: {e}", &name))); + }; + } + + Ok(ValidSearchQuery(query)) + } +} + +impl Search for ValidSearchQuery { + /// Returns the offset from which the search should start. 
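`ValidSearchQuery::from_str` is where a raw query string is parsed and any wasm filters it references are loaded. A minimal sketch using one of the example prompts from `config.rs`, which involves no wasm filter and therefore never touches `FILTERS_DIR`:

```rust
use std::str::FromStr;
use app::search::{Search, SearchContext, ValidSearchQuery};

let query =
    ValidSearchQuery::from_str(r#"offset > 100000 and value contains "music" limit 10"#).unwrap();
assert_eq!(query.limit(), Some(10));
// `offset > 100000` also pins the offset the consumer starts from.
assert!(query.offset().is_some());
// A record is then tested with: query.matches(&SearchContext::new(&record))
```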
+ fn offset(&self) -> Option { + self.0.offset() + } + + fn matches(&self, context: &SearchContext) -> bool { + self.0.matches(context) + } + + fn filters(&self) -> Vec { + self.0.filters() + } +} diff --git a/crates/app/src/search/search_query.rs b/crates/app/src/search/search_query.rs new file mode 100644 index 0000000..bb9be0f --- /dev/null +++ b/crates/app/src/search/search_query.rs @@ -0,0 +1,22 @@ +use async_trait::async_trait; +use lib::{ + search::{filter::Filter, offset::FromOffset}, + SearchQuery, +}; + +use super::{Search, SearchContext}; + +#[async_trait] +impl Search for SearchQuery { + fn offset(&self) -> Option { + self.from.clone().or(self.expression.offset()) + } + + fn matches(&self, context: &SearchContext) -> bool { + self.expression.matches(context) + } + + fn filters(&self) -> Vec { + self.expression.filters() + } +} diff --git a/crates/app/src/search/term.rs b/crates/app/src/search/term.rs new file mode 100644 index 0000000..29d749b --- /dev/null +++ b/crates/app/src/search/term.rs @@ -0,0 +1,28 @@ +use async_trait::async_trait; +use lib::search::{filter::Filter, offset::FromOffset, term::Term}; + +use super::{Search, SearchContext}; + +#[async_trait] +impl Search for Term { + fn offset(&self) -> Option { + match self { + Term::Not(_) => None, + Term::Atom(a) => a.offset(), + } + } + + fn matches(&self, context: &SearchContext) -> bool { + match self { + Term::Not(a) => !a.matches(context), + Term::Atom(a) => a.matches(context), + } + } + + fn filters(&self) -> Vec { + match self { + Term::Not(atom) => atom.filters(), + Term::Atom(atom) => atom.filters(), + } + } +} diff --git a/crates/bin/Cargo.toml b/crates/bin/Cargo.toml new file mode 100644 index 0000000..7fffaae --- /dev/null +++ b/crates/bin/Cargo.toml @@ -0,0 +1,31 @@ +[package] +name = "yozefu" +description = "yozefu is a CLI tool for Apache kafka. It allows you to navigate topics and search Kafka records." +readme = "README.md" +keywords = ["kafka", "consumer", "ratatui"] +categories = [ + "command-line-utilities", + "command-line-interface", + "gui", + "development-tools", +] +version.workspace = true +authors.workspace = true +edition.workspace = true +homepage.workspace = true +license.workspace = true +repository.workspace = true + +[[bin]] +name = "yozf" +path = "src/main.rs" + + +[dependencies] +tokio = { version = "1", features = ["full"] } +command = { workspace = true } + +[features] +ssl-vendored = [ + "command/ssl-vendored" +] diff --git a/crates/bin/LICENSE b/crates/bin/LICENSE new file mode 100644 index 0000000..e5ab6fc --- /dev/null +++ b/crates/bin/LICENSE @@ -0,0 +1,13 @@ +Copyright [2024] yozefu + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. +You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. 
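The `[[bin]]` section above names the executable `yozf`, and the `ssl-vendored` feature is forwarded down to `rdkafka`. Illustrative build commands from the workspace root:

```bash
# Build the `yozf` binary
cargo build --release -p yozefu
# Build with rdkafka's vendored OpenSSL instead of the system library
cargo build --release -p yozefu --features ssl-vendored
```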
\ No newline at end of file diff --git a/crates/bin/README.md b/crates/bin/README.md new file mode 100644 index 0000000..e270edf --- /dev/null +++ b/crates/bin/README.md @@ -0,0 +1,7 @@ +# yozefu + +[![Build](https://github.com/MAIF/yozefu/actions/workflows/build.yml/badge.svg)](https://github.com/MAIF/yozefu/actions/workflows/build.yml) +[![](https://img.shields.io/crates/v/yozefu.svg)](https://crates.io/crates/yozefu) + + +This is where the `main` function of [yozefu](https://github.com/MAIF/yozefu) lives. \ No newline at end of file diff --git a/crates/bin/src/main.rs b/crates/bin/src/main.rs new file mode 100644 index 0000000..c38f353 --- /dev/null +++ b/crates/bin/src/main.rs @@ -0,0 +1,9 @@ +//! The main entry point. + +use command::{Cli, Parser, TuiError}; + +#[tokio::main] +async fn main() -> Result<(), TuiError> { + let parsed = Cli::::parse(); + parsed.execute().await +} diff --git a/crates/command/Cargo.toml b/crates/command/Cargo.toml new file mode 100644 index 0000000..ef30d47 --- /dev/null +++ b/crates/command/Cargo.toml @@ -0,0 +1,58 @@ +[package] +name = "yozefu-command" +description = "Clap commands of yozefu" +keywords = ["argument", "cli", "arg", "parser", "parse"] +readme = "README.md" +categories = [ + "command-line-utilities", + "development-tools", + "command-line-interface", +] +version.workspace = true +authors.workspace = true +edition.workspace = true +homepage.workspace = true +license.workspace = true +repository.workspace = true + + + +[dependencies] +clap = { version = "4.5.23", features = [ + "derive", + "env", + "color", + "suggestions", +] } +serde_json = { version = "1.0.133", features = ["preserve_order"] } +log = "0.4.22" +env_logger = "0.11.5" +chrono = "0.4.39" +strum = { version = "0.26.3", features = ["derive", "strum_macros"] } +directories = "5.0.1" +indicatif = { version = "0.17.9", features = ["tokio"] } +tempfile = "3.14.0" +tokio-util = "0.7.13" +futures = "0.3.31" +itertools = "0.13.0" +tokio = { version = "1", features = ["full", "tracing"] } +rdkafka = { version = "0.37.0", features = [ + "libz-static", + "cmake-build", + "libz-static", + "curl-static", + "libz", + "zstd", + "external-lz4", +] } +extism = { version = "1.9.1" } +indexmap = "2.7.0" +tui = { workspace = true } +app = { workspace = true } +lib = { workspace = true } +reqwest = { version = "0.12.9", features = ["json"] } + +[features] +ssl-vendored = [ + "rdkafka/ssl-vendored" +] diff --git a/crates/command/LICENSE b/crates/command/LICENSE new file mode 100644 index 0000000..b59890b --- /dev/null +++ b/crates/command/LICENSE @@ -0,0 +1,13 @@ +Copyright [2024] yozefu-command + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. +You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. 
\ No newline at end of file diff --git a/crates/command/README.md b/crates/command/README.md new file mode 100644 index 0000000..eea1ddd --- /dev/null +++ b/crates/command/README.md @@ -0,0 +1,20 @@ +# yozefu-command + +[![Build](https://github.com/MAIF/yozefu/actions/workflows/build.yml/badge.svg)](https://github.com/MAIF/yozefu/actions/workflows/build.yml) +[![The crate](https://img.shields.io/crates/v/yozefu-command.svg)](https://crates.io/crates/yozefu-command) + + +This library contains the `clap` commands of [yozefu](https://github.com/MAIF/yozefu): + - `configure` to access to the configuration file. + - `create-filter` to crate a new search filter + - `import-filter` to import the search filter to the tool. + + +The crate also exports a `headless` mode. It is the same application but without the usage of Ratatui. Results are printed to `stdout`. + + +## Usage + +```bash +cargo add yozefu-command +``` \ No newline at end of file diff --git a/crates/command/examples/my_cli.rs b/crates/command/examples/my_cli.rs new file mode 100644 index 0000000..09b0544 --- /dev/null +++ b/crates/command/examples/my_cli.rs @@ -0,0 +1,53 @@ +use clap::Parser; +use rdkafka::ClientConfig; +use strum::{Display, EnumIter, EnumString}; +use tui::TuiError; +use yozefu_command::Cli; + +#[derive(Debug, Clone, PartialEq, Eq, Display, EnumString, EnumIter)] +#[strum(serialize_all = "lowercase")] +enum Cluster { + Localhost, + Test, + Development, + Production, +} + +#[derive(Parser)] +struct MyCli { + #[clap(flatten)] + command: Cli, +} + +impl MyCli { + pub fn kafka_client_config(&self) -> ClientConfig { + let mut config = ClientConfig::new(); + config.set_log_level(rdkafka::config::RDKafkaLogLevel::Emerg); + match self.command.default_command.cluster() { + Cluster::Localhost => { + config.set("bootstrap.servers", "kafka-localhost.acme:9092".to_string()) + } + Cluster::Test => config.set("bootstrap.servers", "kafka-test.acme:9092".to_string()), + Cluster::Development => config.set( + "bootstrap.servers", + "kafka-development.acme:9092".to_string(), + ), + Cluster::Production => config.set( + "bootstrap.servers", + "kafka-production.acme:9092".to_string(), + ), + }; + + config + } + + pub async fn execute(&self) -> Result<(), TuiError> { + self.command.execute_with(self.kafka_client_config()).await + } +} + +#[tokio::main] +async fn main() -> Result<(), String> { + let parsed = MyCli::parse(); + parsed.execute().await.map_err(|e| e.to_string()) +} diff --git a/crates/command/src/cli.rs b/crates/command/src/cli.rs new file mode 100644 index 0000000..16221f4 --- /dev/null +++ b/crates/command/src/cli.rs @@ -0,0 +1,190 @@ +//! The command line argument Parser struct +use crate::command::{Command, MainCommand, UtilityCommands}; +use app::search::filter::FILTERS_DIR; +use app::{ClusterConfig, APPLICATION_NAME}; +use app::{Config, SchemaRegistryConfig}; +use clap::command; +use directories::ProjectDirs; +use lib::Error; +use log::warn; +use rdkafka::ClientConfig; +use reqwest::Url; +use std::fmt::Debug; +use std::fs; +use std::{fmt::Display, path::PathBuf, str::FromStr}; +use tui::error::TuiError; +use tui::Theme; + +pub use clap::Parser; +use indexmap::IndexMap; + +/// Returns the configuration file path. 
+pub fn config_path() -> PathBuf { + ProjectDirs::from("io", "maif", APPLICATION_NAME) + .unwrap() + .config_dir() + .to_path_buf() + .join("config.json") +} + +/// CLI parser +#[derive(Parser)] +#[command(author, version, about = "A terminal user interface to navigate Kafka topics and search for Kafka records.", name = APPLICATION_NAME, bin_name = APPLICATION_NAME, display_name = APPLICATION_NAME, long_about = None, propagate_version = true, args_conflicts_with_subcommands = true)] +pub struct Cli +where + T: Display + Debug + Clone + Sync + Send + 'static + FromStr, + ::Err: Display, +{ + #[command(subcommand)] + pub subcommands: Option, + #[command(flatten)] + pub default_command: MainCommand, +} + +impl Cli +where + T: Display + Debug + Clone + Sync + Send + 'static + FromStr, + ::Err: Display, +{ + /// Executes the CLI. + /// The kafka config client will be loaded from the default config file. + pub async fn execute(&self) -> Result<(), TuiError> { + self.run(None).await + } + + /// Executes the CLI with a specified kafka config client + pub async fn execute_with(&self, config_client: ClientConfig) -> Result<(), TuiError> { + self.run(Some(config_client)).await + } + + async fn run(&self, config_client: Option) -> Result<(), TuiError> { + init_files().await?; + let filters_dir = Config::read(&config_path())?.filters_dir(); + // TODO this sucks + *FILTERS_DIR.lock().unwrap() = filters_dir; + match &self.subcommands { + Some(c) => c.execute().await.map_err(|e| e.into()), + None => { + let config_client = match config_client { + None => self.kafka_client_config()?, + Some(c) => c, + }; + let command = self.default_command.clone().with_client(config_client); + command.execute().await + } + } + } + + /// Returns the kafka client config + fn kafka_client_config(&self) -> Result { + self.default_command.kafka_client_config() + } +} + +/// Initializes a default configuration file if it does not exist. +/// The default cluster is `localhost`. +async fn init_files() -> Result<(), Error> { + init_config_file()?; + init_themes_file().await?; + Ok(()) +} + +/// Initializes a default configuration file if it does not exist. +/// The default cluster is `localhost`. +fn init_config_file() -> Result { + let path = config_path(); + if fs::metadata(&path).is_ok() { + return Ok(path); + } + let mut config = Config::new(&path); + let mut localhost_config = IndexMap::new(); + localhost_config.insert( + "bootstrap.servers".to_string(), + "localhost:9092".to_string(), + ); + localhost_config.insert("security.protocol".to_string(), "plaintext".to_string()); + localhost_config.insert("broker.address.family".to_string(), "v4".to_string()); + config + .default_kafka_config + .insert("fetch.min.bytes".to_string(), "10000".to_string()); + + config.clusters.insert( + "localhost".into(), + ClusterConfig { + kafka: localhost_config, + schema_registry: Some(SchemaRegistryConfig { + url: Url::parse("http://localhost:8081").unwrap(), + headers: Default::default(), + }), + ..Default::default() + }, + ); + + fs::create_dir_all(config.filters_dir())?; + fs::write(&path, serde_json::to_string_pretty(&config).unwrap()).unwrap(); + #[cfg(unix)] + { + use std::os::unix::fs::PermissionsExt; + let mut perms: fs::Permissions = fs::metadata(&path)?.permissions(); + perms.set_mode(0o600); + fs::set_permissions(&path, perms)?; + } + + Ok(path) +} + +/// Initializes a default configuration file if it does not exist. +/// The default cluster is `localhost`. 
+async fn init_themes_file() -> Result { + let path = config_path(); + let config = Config::new(&path); + let path = config.themes_file(); + if fs::metadata(&path).is_ok() { + return Ok(path); + } + + let default_theme = Theme::light(); + let mut default_themes = IndexMap::new(); + default_themes.insert(default_theme.name.clone(), default_theme); + + let content = match reqwest::get( + "https://raw.githubusercontent.com/MAIF/yozefu/refs/heads/main/crates/command/themes.json", + ) + .await + { + Ok(response) => match response.status().is_success() { + true => response.text().await.unwrap(), + false => { + warn!("HTTP {} when downloading theme file", response.status()); + serde_json::to_string_pretty(&default_themes).unwrap() + } + }, + Err(e) => { + warn!("Error while downloading theme file: {}", e); + serde_json::to_string_pretty(&default_themes).unwrap() + } + }; + + let e: IndexMap = match serde_json::from_str(&content) { + Ok(themes) => themes, + Err(_) => default_themes, + }; + + fs::write(&path, &serde_json::to_string_pretty(&e)?)?; + Ok(path) +} + +#[test] +pub fn test_conflicts() { + use clap::CommandFactory; + Cli::::command().debug_assert(); +} +#[test] +fn test_valid_themes() { + use std::collections::HashMap; + use tui::Theme; + + let content = include_str!("../themes.json"); + let themes: HashMap = serde_json::from_str(content).unwrap(); + assert!(themes.keys().len() >= 3) +} diff --git a/crates/command/src/command/config_command.rs b/crates/command/src/command/config_command.rs new file mode 100644 index 0000000..ea52342 --- /dev/null +++ b/crates/command/src/command/config_command.rs @@ -0,0 +1,36 @@ +//! Command that prints the config to `stdout`. +//! +//! ```bash +//! yozf config | jq '.clusters | keys' +//! ``` + +use app::Config; +use clap::Args; +use lib::Error; +use log::info; + +use crate::{command::Command, config_path}; + +use super::configure::ConfigureSubCommand; + +#[derive(Debug, Clone, Args)] +pub struct ConfigCommand { + #[command(subcommand)] + pub subcommand: Option, +} + +impl Command for ConfigCommand { + async fn execute(&self) -> Result<(), Error> { + if let Some(ref subcommand) = self.subcommand { + return subcommand.execute().await; + } + + let path = config_path(); + info!("The configuration file is located at '{}'", path.display()); + + let config = Config::read(&path)?; + println!("{}", serde_json::to_string_pretty(&config)?); + + Ok(()) + } +} diff --git a/crates/command/src/command/configure/get_command.rs b/crates/command/src/command/configure/get_command.rs new file mode 100644 index 0000000..00af4b0 --- /dev/null +++ b/crates/command/src/command/configure/get_command.rs @@ -0,0 +1,80 @@ +//! Command to fetch a property of the configuration file. +use std::{collections::HashMap, fs}; + +use crate::{cli::config_path, command::Command as CliCommand}; +use app::Config; +use clap::Args; +use lib::Error; +use serde_json::Value; + +#[derive(Debug, Args, Clone)] +pub struct ConfigureGetCommand { + /// Property you want to read. It must be a JavaScript Object Notation Pointer (RFC 6901) + /// Special keywords are also supported: 'config', 'filters', 'logs' etc... 
+ property: String, +} + +impl CliCommand for ConfigureGetCommand { + async fn execute(&self) -> Result<(), Error> { + let file = config_path(); + let content = fs::read_to_string(&file)?; + let config = serde_json::from_str::(&content)?; + let mut property_name = self.property.clone(); + if !self.property.starts_with('/') { + property_name = format!("/{}", property_name); + } + match config.pointer(&property_name) { + Some(p) => { + match p { + Value::Null => println!("null"), + Value::Bool(v) => println!("{}", v), + Value::Number(v) => println!("{}", v), + Value::String(v) => println!("{}", v), + Value::Array(v) => println!("{}", serde_json::to_string_pretty(&v)?), + Value::Object(v) => println!("{}", serde_json::to_string_pretty(&v)?), + } + + Ok(()) + } + None => { + let config = Config::read(&file)?; + match self.property.as_str() { + "filters" | "filter" | "fn" | "func" | "functions" => { + let paths = fs::read_dir(config.filters_dir())?; + let mut filters = HashMap::new(); + for path in paths { + let n = path.unwrap(); + if n.file_type().unwrap().is_file() + && n.path().extension().map(|s| s == "wasm").unwrap_or(false) + { + filters.insert( + n.path().file_stem().unwrap().to_str().unwrap().to_string(), + n.path(), + ); + } + } + println!("{:}", serde_json::to_string_pretty(&filters)?); + } + "filter_dir" | "filters_dir" | "filters-dir" | "functions-dir" + | "functions_dir" | "function_dir" => { + println!("{:?}", config.filters_dir().display()) + } + "log" | "logs" => println!("{:?}", config.logs_file().display()), + "configuration_file" | "configuration-file" | "config" | "conf" => { + println!("{:?}", file) + } + "directory" | "dir" => println!("{:?}", file.parent().unwrap()), + "themes" => println!("{}", serde_json::to_string_pretty(&config.themes())?), + "themes_file" | "theme_file" => println!("{:?}", config.themes_file()), + _ => { + return Err(Error::Error(format!( + "There is no '{}' property in the config file", + self.property + ))) + } + } + Ok(()) + } + } + } +} diff --git a/crates/command/src/command/configure/mod.rs b/crates/command/src/command/configure/mod.rs new file mode 100644 index 0000000..6e345a4 --- /dev/null +++ b/crates/command/src/command/configure/mod.rs @@ -0,0 +1,90 @@ +//! Module gathering the commands related to configuration: +//! - `configure` +//! - `configure get` +//! - `configure set` + +use std::{fs, process::Command}; + +use app::Config; +use clap::{Args, Subcommand}; +use lib::Error; +use log::info; +use tempfile::tempdir; + +mod get_command; +mod set_command; + +pub use get_command::ConfigureGetCommand; +pub use set_command::ConfigureSetCommand; + +use crate::cli::config_path; + +use super::default_editor; + +/// Command to edit the configuration file. 
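Together, `get` and `set` give scriptable access to single properties, addressed either by an RFC 6901 pointer or by one of the special keywords handled in the fallback branch. Hypothetical invocations (the exact top-level command name is defined in `UtilityCommands`, which is not part of this excerpt):

```bash
yozf config get theme                      # a leading '/' is added automatically
yozf config get /clusters/localhost/kafka  # any JSON pointer into the config file
yozf config get filters_dir                # special keyword: directory holding the wasm filters
yozf config set theme dark                 # rejected unless the new value keeps the same JSON type
```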
+#[derive(Debug, Args, Clone)] +pub struct ConfigureCommand { + /// Your favorite code editor + #[clap(short, long)] + pub editor: Option, + #[command(subcommand)] + pub subcommand: Option, +} + +#[derive(Debug, Subcommand, Clone)] +pub enum ConfigureSubCommand { + /// Read a specific property from the config file + Get(ConfigureGetCommand), + /// Edit a specific property from the config file + Set(ConfigureSetCommand), +} + +impl crate::command::Command for ConfigureSubCommand { + async fn execute(&self) -> Result<(), Error> { + match self { + ConfigureSubCommand::Get(c) => c.execute().await, + ConfigureSubCommand::Set(c) => c.execute().await, + } + } +} + +impl crate::command::Command for ConfigureCommand { + async fn execute(&self) -> Result<(), Error> { + if let Some(ref subcommand) = self.subcommand { + return subcommand.execute().await; + } + + let file = config_path(); + let temp_file = + tempdir()?.path().join(file.file_name().expect( + "the configuration path should be a file, not a directory or something else", + )); + fs::create_dir_all(temp_file.parent().expect("temp_file.parent() should return something unless the configuration file is at the root of your file system?"))?; + fs::copy(&file, &temp_file)?; + + let editor = default_editor(&self.editor); + Command::new(editor) + .arg(&temp_file) + .status() + .expect("Something went wrong"); + + let new_config = fs::read_to_string(&temp_file)?; + fs::remove_file(temp_file)?; + match serde_json::from_str::(&new_config) { + Ok(o) => { + fs::write(&file, serde_json::to_string_pretty(&o)?)?; + info!( + "Configuration file '{}' has been updated successfully.", + config_path().display() + ); + } + Err(e) => { + return Err(Error::Error(format!( + "Your new config file is not valid. Please try again: {:?}", + e + ))); + } + }; + Ok(()) + } +} diff --git a/crates/command/src/command/configure/set_command.rs b/crates/command/src/command/configure/set_command.rs new file mode 100644 index 0000000..16c16c1 --- /dev/null +++ b/crates/command/src/command/configure/set_command.rs @@ -0,0 +1,51 @@ +//! Command to edit the configuration file. + +use std::fs; + +use crate::{cli::config_path, command::Command as CliCommand}; +use app::Config; +use clap::Args; +use lib::Error; +use log::info; +use serde_json::Value; + +#[derive(Debug, Args, Clone)] +pub struct ConfigureSetCommand { + /// Property you want to edit. 
It must be a JavaScript Object Notation Pointer (RFC 6901) + property: String, + /// Its new value + value: String, +} + +impl CliCommand for ConfigureSetCommand { + async fn execute(&self) -> Result<(), Error> { + let file = config_path(); + + let content = fs::read_to_string(&file)?; + let mut config = serde_json::from_str::(&content)?; + let mut property_name = self.property.clone(); + if !self.property.starts_with('/') { + property_name = format!("/{}", property_name); + } + let property = config.pointer_mut(&property_name); + if property.is_none() { + return Err(Error::Error(format!( + "Property '{}' does not exist", + self.property + ))); + } + let old_value = property.unwrap(); + let new_value = + serde_json::from_str(&self.value).unwrap_or(Value::String(self.value.clone())); + match std::mem::discriminant(old_value) == std::mem::discriminant(&new_value) { + true => { + let _ = std::mem::replace(old_value, new_value); + info!("'{}' is now equal to '{}'", property_name, old_value); + let c: Config = serde_json::from_value(config)?; + fs::write(file, serde_json::to_string_pretty(&c)?)?; + Ok(()) + }, + false => Err(Error::Error(format!("Old value is '{}'. The new value is '{}'. As you can see, these are 2 different types", old_value, new_value))), + } + } +} diff --git a/crates/command/src/command/create_filter.rs b/crates/command/src/command/create_filter.rs new file mode 100644 index 0000000..29493c8 --- /dev/null +++ b/crates/command/src/command/create_filter.rs @@ -0,0 +1,122 @@ +//! Command to create a new wasm filter. +use std::{ + fs, + path::{Path, PathBuf}, +}; + +use chrono::Local; +use clap::Args; +use lib::Error; +use log::{info, warn}; +use strum::{Display, EnumIter, EnumString}; + +use crate::{ + command::{default_editor, Command}, + APPLICATION_NAME, +}; +use std::process::Command as ProcessCommand; + +#[derive(Debug, Clone, Args)] +pub struct CreateFilterCommand { + #[clap(short, long)] + /// The programming language used to build the WebAssembly module + language: SupportedLanguages, + #[clap(long)] + /// Location of the search filter repository + directory: Option, + /// Name of the search filter + name: String, +} + +#[derive(Debug, Clone, EnumString, EnumIter, Display)] +#[strum(serialize_all = "lowercase")] +pub enum SupportedLanguages { + Rust, + Golang, +} + +impl Command for CreateFilterCommand { + async fn execute(&self) -> Result<(), Error> { + let repo_dir = match &self.directory { + Some(d) => d.clone(), + None => std::env::temp_dir().join(format!( + "{}-filter-{}-{}", + APPLICATION_NAME, + self.name.clone(), + Local::now().timestamp() + )), + }; + + let editor = default_editor(&None); + + info!("Cloning the filter repository to '{}'", repo_dir.display()); + let output = ProcessCommand::new("git") + .arg("clone") + .arg("git@github.com:MAIF/yozefu.git") + .arg("--depth") + .arg("1") + .arg(&repo_dir) + .spawn()? + .wait()?; + + match output.success() { + true => { + self.prepare_git_repository(&repo_dir)?; + + info!( + "The filter repository has been initialized: '{}'", + repo_dir.display() + ); + info!( + "You can now implement your wasm filter in the repository: '{}'", + repo_dir.display() + ); + } + false => { + warn!("I was not able to clone the repository. 
Please download it manually."); + println!(" mkdir -p '{}'", repo_dir.parent().unwrap().display()); + println!( + " curl -L 'https://github.com/MAIF/yozefu/archive/refs/heads/main.zip'" + ); + println!(" unzip yozefu-main.zip -d ."); + println!(" mv 'yozefu-main' {}", repo_dir.display()); + } + } + + println!(" {} '{}'", editor, repo_dir.display()); + println!(" make -C '{}' build", repo_dir.display()); + let binary = std::env::current_exe()?; + println!( + " {} import-filter '{}' --name '{}'", + binary.file_name().unwrap().to_str().unwrap(), + repo_dir.join("module.wasm").display(), + self.name + ); + println!(" rm '{}'", repo_dir.display()); + Ok(()) + } +} + +impl CreateFilterCommand { + /// Clones the [`wasm-blueprints`](https://github.com/MAIF/yozefu/tree/main/crates/wasm-blueprints) repository + /// and reorganizes directories to keep only the programming language selected by the user. + fn prepare_git_repository(&self, repo_dir: &Path) -> Result<(), Error> { + let source = repo_dir + .join("crates") + .join("wasm-blueprints") + .join(self.language.to_string()); + let temp = repo_dir.parent().unwrap().join( + repo_dir + .file_name() + .map(|e| format!("{}-temp", e.to_str().unwrap())) + .unwrap(), + ); + + fs::rename(source, &temp)?; + fs::remove_dir_all(repo_dir)?; + fs::rename(&temp, repo_dir)?; + info!("Preparing the repository"); + + Ok(()) + } +} diff --git a/crates/command/src/command/import_filter.rs b/crates/command/src/command/import_filter.rs new file mode 100644 index 0000000..cdcdce9 --- /dev/null +++ b/crates/command/src/command/import_filter.rs @@ -0,0 +1,103 @@ +//! Command to import a search filter. +use std::{fs, path::PathBuf}; + +use app::{ + search::filter::{MATCHES_FUNCTION_NAME, PARSE_PARAMETERS_FUNCTION_NAME}, + Config, +}; +use clap::Args; +use extism::{Manifest, Plugin, Wasm}; +use lib::Error; +use log::info; + +use crate::{command::Command, config_path}; + +/// Import a search filter. +/// It also checks that it complies with the tool requirements. +#[derive(Debug, Clone, Args)] +pub struct ImportFilterCommand { + /// Search filter to import + file: PathBuf, + /// Name of the search filter + #[clap(short, long = "name")] + filter_name: Option, + /// Overwrite the search filter file if it already exists + #[clap(long)] + force: bool, +} + +/// Wasm functions a search filter must expose. +pub const REQUIRED_WASM_FUNCTIONS: [&str; 2] = + [PARSE_PARAMETERS_FUNCTION_NAME, MATCHES_FUNCTION_NAME]; + +impl Command for ImportFilterCommand { + async fn execute(&self) -> Result<(), Error> { + let destination = self.destination()?; + let name = self.name(); + if fs::metadata(&destination).is_ok() && !self.force { + return Err(Error::Error(format!("The wasm function '{}' already exists. If you want to import it again, please delete it first or use the '--force' flag.", destination.display()))); + } + + self.check_wasm_module(&self.file)?; + fs::copy(&self.file, &destination)?; + info!("'{}' has been imported successfully", destination.display()); + info!("To use it: `from begin offset > 50 && {}(...)`", name); + + Ok(()) + } +} + +impl ImportFilterCommand { + /// Returns the path to the wasm file. + pub fn destination(&self) -> Result { + let name = self.name(); + let config = Config::read(&config_path())?; + let dir = config.filters_dir(); + fs::create_dir_all(&dir)?; + Ok(dir.join(format!("{}.wasm", name))) + } + + /// Returns the name of the search filter. 
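Once a module exposes the two required functions, importing it and calling it from a query is a two-step affair. An illustrative session (the filter name is made up and the query shape follows the log message printed by `import-filter`):

```bash
# Copy the wasm module into the filters directory under the name `key_ends_with`
yozf import-filter ./module.wasm --name key_ends_with
# Reference it from a search query like any other predicate
yozf -c localhost -t my-topic 'from beginning key_ends_with("-fr") limit 10'
```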
+ pub fn name(&self) -> String { + match &self.filter_name { + Some(name) => name.to_string(), + None => self.file.file_stem().unwrap().to_str().unwrap().to_string(), + } + } + + /// Checks that the search filter complies with the tool requirements. + /// The search filter must expose functions defined in `REQUIRED_WASM_FUNCTIONS`. + fn check_wasm_module(&self, wasm_file: &PathBuf) -> Result<(), Error> { + let url = Wasm::file(wasm_file); + let manifest = Manifest::new([url]); + let mut filter = + Plugin::new(manifest, [], true).map_err(|e| Error::Error(e.to_string()))?; + check_presence_of_functions(&mut filter)?; + Ok(()) + } +} + +fn check_presence_of_functions(plugin: &mut Plugin) -> Result<(), Error> { + for function_name in REQUIRED_WASM_FUNCTIONS.iter() { + match plugin.function_exists(function_name) { + true => info!("'{}' found in the search filter", function_name), + false => { + return Err(Error::Error(format!( + "'{}' is missing in the search filter. Make sure the wasm module exports a '{}' filter", + function_name, function_name + ))) + } + } + } + Ok(()) +} + +//fn check_parse_parameters(plugin: &mut Plugin) -> Result<(), Error> { +// match plugin +// .call::(PARSE_PARAMETERS_FUNCTION_NAME, "[]".to_string()) +// .map(|e| e == 0) +// { +// Ok(_) => Ok(()), +// Err(e) => Err(Error::Error(e.to_string())), +// } +//} diff --git a/crates/command/src/command/main_command.rs b/crates/command/src/command/main_command.rs new file mode 100644 index 0000000..70d75ad --- /dev/null +++ b/crates/command/src/command/main_command.rs @@ -0,0 +1,362 @@ +//! Main command for the CLI. +//! +//! To execute the commande, you need: +//! 1. To call `with_client` with a `ClientConfig` to get a `MainCommandWithClient`. +//! 2. To call `execute` on the `MainCommandWithClient`. 
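In code, the two steps described above amount to something like the following sketch (assuming a `T` such as the `Cluster` enum from `examples/my_cli.rs`, with error handling elided):

```rust
use clap::Parser;
use rdkafka::ClientConfig;

// Hypothetical: parse the CLI, attach a Kafka client configuration, then run it.
let command: MainCommand<Cluster> = MainCommand::parse();
let mut kafka = ClientConfig::new();
kafka.set("bootstrap.servers", "localhost:9092");
command.with_client(kafka).execute().await?;
```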
+ +use std::collections::HashMap; +use std::fmt::Display; +use std::path::{Path, PathBuf}; +use std::str::FromStr; +use std::{fs, io}; + +use app::search::ValidSearchQuery; +use chrono::Local; + +use app::{App, Config}; +use clap::Parser; +use indicatif::{ProgressBar, ProgressDrawTarget, ProgressStyle}; +use itertools::Itertools; +use lib::Error; +use log::{debug, info, warn}; +use rdkafka::ClientConfig; +use strum::{Display, EnumString}; +use tui::error::TuiError; +use tui::Theme; +use tui::{State, Ui}; + +use crate::cli::config_path; +use crate::headless::formatter::{ + JsonFormatter, KafkaFormatter, PlainFormatter, SimpleFormatter, TransposeFormatter, +}; +use crate::headless::Headless; +use crate::log::init_logging_file; +use crate::APPLICATION_NAME; + +use super::main_command_with_client::MainCommandWithClient; + +fn parse_cluster(s: &str) -> Result +where + T: FromStr, + ::Err: Display, +{ + s.parse() + .map_err(|e: ::Err| Error::Error(e.to_string())) +} + +#[derive(Parser, Clone)] +#[command(author, version, about, long_about = None, propagate_version = true)] +pub struct MainCommand +where + T: Display + Clone + Sync + Send + 'static + FromStr, + ::Err: Display, +{ + #[clap(short, long)] + /// Log level set to 'debug' + pub debug: bool, + #[clap(short = 'c', short_alias='e', alias="environment", long, value_parser = parse_cluster::, required_unless_present_any=&["version", "help"])] + /// The cluster to use + cluster: Option, + /// Topics to consume + #[clap( + short, + long, + group = "topic", + use_value_delimiter = true, + value_delimiter = ',' + )] + pub topics: Vec, + /// Override kafka consumer properties, see + #[clap(short, long, use_value_delimiter = true, value_delimiter = ',')] + pub properties: Vec, + #[clap(long)] + /// Disable the TUI, print results in stdout instead. + pub headless: bool, + /// The initial search query. If you start the query with the letter @, the rest should be a filename to read the data from, or - if you want yozefu to read the data from stdin. + query: Vec, + #[clap(long)] + /// Theme to use + pub theme: Option, + #[clap(long, requires = "headless")] + /// Specify the output format of kafka records + pub format: Option, + #[clap(long, requires = "headless")] + /// Disable progress in stderr + pub disable_progress: bool, + #[clap(long, requires = "headless")] + /// Export kafka records in the given file + pub export: bool, + #[clap(short, long)] + /// Name of the file to export kafka records + pub output: Option, + #[clap(long)] + /// Use a specific config file + pub config: Option, +} + +#[derive(Debug, Clone, EnumString, Display)] +#[strum(serialize_all = "lowercase")] +pub enum KafkaFormatterOption { + Transpose, + Simple, + Plain, + Human, + Json, + Log, +} + +impl MainCommand +where + T: Display + Clone + Sync + Send + 'static + FromStr, + ::Err: Display, +{ + /// Create a new `MainCommandWithClient` with a `ClientConfig`. + pub fn with_client(self, client_config: ClientConfig) -> MainCommandWithClient { + MainCommandWithClient::new(self, client_config) + } + + /// Returns the search query to use. 
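The flags above cover both interactive and scripted use. A few illustrative invocations (cluster, topics and queries are made up):

```bash
# Interactive TUI on the `localhost` cluster, starting from the last 10 records
yozf -c localhost -t my-topic 'from end - 10'
# Headless mode: print matching records to stdout as JSON
yozf -c localhost --headless --topics orders,payments --format json 'value contains "error" limit 50'
# Read the search query from a file instead of the command line
yozf -c localhost -t my-topic @query.txt
```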
+ pub(crate) fn query(&self, config: &Config) -> Result { + App::load_config(config); + let q = self.query.join(" ").trim().to_string(); + if q.is_empty() { + return Ok(config.initial_query.clone()); + } + + if q == "-" { + info!("Reading query from stdin"); + let mut buffer = String::new(); + io::stdin().read_line(&mut buffer)?; + return Ok(buffer); + } + + match q.starts_with("@") { + true => { + let query_file = Path::new(&q[1..]); + fs::read_to_string(query_file).map_err(|e| { + Error::Error(format!( + "Cannot read search query from file {:?}: {}", + query_file.display(), + e + )) + }) + } + false => Ok(q), + } + } + + pub(crate) fn config(&self) -> Result { + let path = self.config.clone().unwrap_or(config_path()); + Config::read(&path) + } + + pub fn cluster(&self) -> T { + self.cluster.as_ref().unwrap().clone() + } + + pub fn themes(file: &Path) -> Result, Error> { + let content = fs::read_to_string(file)?; + let themes: HashMap = serde_json::from_str(&content).map_err(|e| { + Error::Error(format!( + "Error while parsing themes file '{}': {}", + file.display(), + e + )) + })?; + Ok(themes) + } + + pub fn load_theme(file: &Path, name: &str) -> Result { + let themes = Self::themes(file)?; + let theme = match themes.get(name) { + Some(theme) => theme, + None => { + warn!("Theme '{}' not found. Available themes are [{}]. Make sure it is defined in '{}'", + + name, + themes.keys().join(", "), + file.display()); + let theme = themes.iter().next().unwrap().1; + info!("Since the theme was not found, I'm going to use the first available theme '{}'", theme.name); + theme + } + }; + Ok(theme.clone()) + } + + /// Starts the app in TUI mode + pub(crate) async fn tui(&self, kafka_client_config: ClientConfig) -> Result<(), TuiError> { + let cluster = self.cluster(); + let config = self.config()?; + let query = self.query(&config)?; + + let theme_name = self.theme.clone().unwrap_or(config.theme.clone()); + let color_palette = Self::load_theme(&config.themes_file(), &theme_name)?; + + let state = State::new(&cluster.to_string(), color_palette, &config); + let mut app = Ui::new( + self.app(&query, kafka_client_config)?, + query, + self.topics.clone(), + state.clone(), + ) + .await?; + + let _ = init_logging_file(self.debug, &config.logs_file()); + app.run(self.topics.clone(), state).await + } + + /// Creates the App + fn app(&self, query: &str, kafka_client_config: ClientConfig) -> Result { + let config = self.config()?; + let search_query = ValidSearchQuery::from_str(query)?; + Ok(App::new( + config, + self.cluster().to_string(), + kafka_client_config, + search_query, + self.output_file()?, + )) + } + + /// Returns the output file to use to store exported kafka records. 
+ pub(crate) fn output_file(&self) -> Result { + let output = match &self.output { + Some(o) => o.clone(), + None => { + let config = self.config()?; + config.export_directory.join(format!( + "export-{}.json", + // Windows does not support ':' in filenames + Local::now() + .to_rfc3339_opts(chrono::SecondsFormat::Secs, false) + .replace(':', "-"), + )) + } + }; + Ok(output) + } + + /// Starts the app in headless mode + pub(crate) async fn headless(&self, kafka_client_config: ClientConfig) -> Result<(), Error> { + let config = self.config()?; + let query = self.query(&config)?; + + let progress = ProgressBar::new(0); + let date = chrono::Utc::now().to_rfc3339_opts(chrono::SecondsFormat::Secs, true); + progress.set_draw_target(ProgressDrawTarget::hidden()); + progress.set_style( + ProgressStyle::with_template(&format!( + "[{date} {{msg:.green}} headless] {{pos}} records read {{per_sec}}" + )) + .map_err(|e| Error::Error(e.to_string()))?, + ); + progress.set_message("INFO"); + + let topics = self.topics(&kafka_client_config)?; + if !self.disable_progress { + progress.set_draw_target(ProgressDrawTarget::stderr()); + } + let app = Headless::new( + self.app(&query, kafka_client_config)?, + &topics, + self.formatter(), + self.export, + progress, + ); + + self.print_full_command(&self.cluster().to_string(), &topics, &query); + + app.run().await?; + self.print_full_command(&self.cluster().to_string(), &topics, &query); + Ok(()) + } + + fn print_full_command(&self, cluster: &str, topics: &[String], query: &str) { + if self.topics.is_empty() { + let binary = std::env::current_exe() + .map(|f| f.file_name().unwrap().to_str().unwrap().to_string()) + .unwrap_or(APPLICATION_NAME.to_string()); + info!( + "Executed command: {} -c {} --headless --topics {} '{}'", + binary, + cluster.to_string(), + topics.join(","), + query + ) + } + } + + /// Lists available topics when the user didn't provide any + fn topics(&self, kafka_config: &ClientConfig) -> Result, Error> { + if !self.topics.is_empty() { + return Ok(self.topics.clone()); + } + let items = App::list_topics_from_client(kafka_config)?; + println!( + "Select topics to consume:\n {}", + items.iter().take(20).join("\n ") + ); + if items.len() > 20 { + println!("... 
and {} more", items.len() - 20); + } + std::process::exit(1) + } + + /// Creates a formatter for the headless mode + fn formatter(&self) -> Box { + match &self.format { + Some(d) => match d { + KafkaFormatterOption::Transpose => Box::new(TransposeFormatter::new()), + KafkaFormatterOption::Simple => Box::new(SimpleFormatter::new()), + KafkaFormatterOption::Plain => Box::new(PlainFormatter::new()), + KafkaFormatterOption::Json => Box::new(JsonFormatter::new()), + KafkaFormatterOption::Human => Box::new(SimpleFormatter::new()), + KafkaFormatterOption::Log => Box::new(PlainFormatter::new()), + }, + None => Box::new(TransposeFormatter::new()), + } + } + + /// Returns the kafka client config from the configuration file + pub(crate) fn kafka_client_config(&self) -> Result { + let config = self.config()?; + let mut kafka_properties = config.kafka_config_of(&self.cluster().to_string())?; + + for property in &self.properties { + match property.split_once('=') { + Some((key, value)) => { + kafka_properties.insert(key.trim().into(), value.into()); + } + None => { + return Err(Error::Error(format!("Invalid kafka property '{}', expected a '=' symbol to separate the property and its value.", property))); + } + } + } + + // Default properties + for (key, value) in [ + ("group.id", APPLICATION_NAME), + ("enable.auto.commit", "false"), + ] { + if !kafka_properties.contains_key(key) { + kafka_properties.insert(key.into(), value.into()); + } + } + + let mut config = ClientConfig::new(); + config.set_log_level(rdkafka::config::RDKafkaLogLevel::Emerg); + debug!( + "Kafka properties: {:?}", + kafka_properties + .iter() + .map(|(k, v)| format!("{}={}", k, v)) + .join(", ") + ); + for (key, value) in kafka_properties { + config.set(key, value); + } + + Ok(config) + } +} diff --git a/crates/command/src/command/main_command_with_client.rs b/crates/command/src/command/main_command_with_client.rs new file mode 100644 index 0000000..bfce199 --- /dev/null +++ b/crates/command/src/command/main_command_with_client.rs @@ -0,0 +1,52 @@ +//! `MainCommandWithClient` is a wrapper around `MainCommand`. +//! When using `MainCommandWithClient`, you can pass a custom `ClientConfig` instead of using the one generated automatically from the config file. +//! I need it because I want to be able to pass a custom `ClientConfig` with specific kafka consumer properties that are not stored in the config file. + +use std::fmt::Display; +use std::str::FromStr; + +use rdkafka::ClientConfig; +use tui::error::TuiError; + +use crate::log::init_logging_stderr; + +use super::main_command::MainCommand; + +pub struct MainCommandWithClient +where + T: Display + Clone + Sync + Send + 'static + FromStr, + ::Err: Display, +{ + command: MainCommand, + client_config: ClientConfig, +} + +/// T must implement the trait FromStr. 
+impl MainCommandWithClient +where + T: Display + Clone + Sync + Send + 'static + FromStr, + ::Err: Display, +{ + pub fn new(command: MainCommand, client_config: ClientConfig) -> Self { + Self { + command, + client_config, + } + } + + pub async fn execute(self) -> Result<(), TuiError> { + match self.command.headless { + true => { + let _ = init_logging_stderr(self.command.debug); + self.command + .headless(self.client_config) + .await + .map_err(|e| e.into()) + } + false => { + // Ignore the result, we just want to make sure the logger is initialized + self.command.tui(self.client_config).await + } + } + } +} diff --git a/crates/command/src/command/mod.rs b/crates/command/src/command/mod.rs new file mode 100644 index 0000000..4b55ddc --- /dev/null +++ b/crates/command/src/command/mod.rs @@ -0,0 +1,36 @@ +//! The [`clap`](https://crates.io/crates/clap) commands of the tool +use std::env::var; + +use lib::Error; + +mod config_command; +pub mod configure; +mod create_filter; +mod import_filter; +mod main_command; +mod main_command_with_client; +mod utility_commands; + +pub use create_filter::CreateFilterCommand; +pub use import_filter::ImportFilterCommand; +pub use main_command::MainCommand; +pub use utility_commands::UtilityCommands; + +#[cfg(target_family = "windows")] +pub const DEFAULT_EDITOR: &str = "notepad"; +#[cfg(not(target_family = "windows"))] +pub const DEFAULT_EDITOR: &str = "vim"; + +/// Trait for all commands. +/// I don't know if it's relevant... +pub trait Command: Send { + #[allow(async_fn_in_trait)] + async fn execute(&self) -> Result<(), Error>; +} + +fn default_editor(editor: &Option) -> String { + editor + .clone() + .or(var("EDITOR").ok()) + .unwrap_or(DEFAULT_EDITOR.to_string()) +} diff --git a/crates/command/src/command/utility_commands.rs b/crates/command/src/command/utility_commands.rs new file mode 100644 index 0000000..569fcad --- /dev/null +++ b/crates/command/src/command/utility_commands.rs @@ -0,0 +1,49 @@ +//! Other CLI commands to make the tool user-friendly. + +use clap::Subcommand; +use lib::Error; +use strum::{Display, EnumString}; + +use crate::log::init_logging_stderr; + +use super::{ + config_command::ConfigCommand, configure::ConfigureCommand, Command, CreateFilterCommand, + ImportFilterCommand, +}; + +#[derive(Subcommand, Debug)] +pub enum UtilityCommands { + /// Import a search filter + #[clap(aliases = ["register-filter"])] + ImportFilter(ImportFilterCommand), + /// Helper to create a new WebAssembly search filter + #[clap(alias = "new-filter")] + CreateFilter(CreateFilterCommand), + /// Edit the configuration file + Configure(ConfigureCommand), + /// Print the config to `stdout` + Config(ConfigCommand), +} + +#[derive(Debug, Clone, EnumString, Display)] +#[strum(serialize_all = "lowercase")] +pub enum KafkaFormatterOption { + Transpose, + Simple, + Plain, + Human, + Json, + Log, +} + +impl Command for UtilityCommands { + async fn execute(&self) -> Result<(), Error> { + let _ = init_logging_stderr(false); + match self { + Self::ImportFilter(command) => command.execute().await, + Self::CreateFilter(command) => command.execute().await, + Self::Configure(command) => command.execute().await, + Self::Config(command) => command.execute().await, + } + } +} diff --git a/crates/command/src/headless/formatter/json_formatter.rs b/crates/command/src/headless/formatter/json_formatter.rs new file mode 100644 index 0000000..4b9f97d --- /dev/null +++ b/crates/command/src/headless/formatter/json_formatter.rs @@ -0,0 +1,44 @@ +//! 
Formats a kafka record as a json object. +//! ```json +//! { +//! "value": "{ \"name\": \"Tarte Tatin\", \"ingredients\": [\"apples\", \"puff pastry\", \"butter\", \"sugar\"], \"instructions\": \"Caramelize apples in butter and sugar, top with puff pastry, and bake.\"},", +//! "key": "1", +//! "topic": "patisserie-delights-dlq", +//! "timestamp": 1727734680195, +//! "partition": 0, +//! "offset": 529896, +//! "headers": { +//! "kafka_dlt-exception-fqcn": "panic: runtime error: invalid memory address or nil pointer dereference", +//! "kafka_dlt-exception-message": "The cooking process has failed", +//! "kafka_dlt-exception-stacktrace": "[signal SIGSEGV: segmentation violation code=0xffffffff addr=0x0 pc=0x20314]", +//! "kafka_dlt-original-offset": "197939", +//! "kafka_dlt-original-partition": "0", +//! "kafka_dlt-original-topic": "patisserie-delights-dlq", +//! "kafka_timestampType": "2024-09-30T22:18:00.193234027Z" +//! } +//! } +//! ``` +use lib::KafkaRecord; + +use super::KafkaFormatter; + +#[derive(Clone)] +pub struct JsonFormatter {} + +impl Default for JsonFormatter { + fn default() -> Self { + Self::new() + } +} + +impl JsonFormatter { + pub fn new() -> Self { + Self {} + } +} + +impl KafkaFormatter for JsonFormatter { + fn fmt(&self, record: &KafkaRecord) -> String { + serde_json::to_string_pretty(&record).unwrap_or("".to_string()) + } +} diff --git a/crates/command/src/headless/formatter/mod.rs b/crates/command/src/headless/formatter/mod.rs new file mode 100644 index 0000000..ad00da9 --- /dev/null +++ b/crates/command/src/headless/formatter/mod.rs @@ -0,0 +1,18 @@ +//! Formatters for displaying kafka records to stdout. + +mod json_formatter; +mod plain_formatter; +mod simple_formatter; +mod transpose_formatter; + +pub use json_formatter::JsonFormatter; +pub use plain_formatter::PlainFormatter; +pub use simple_formatter::SimpleFormatter; +pub use transpose_formatter::TransposeFormatter; + +use lib::KafkaRecord; + +/// A kafka formatter displays a kafka record to stdout. +pub trait KafkaFormatter: Sync + Send { + fn fmt(&self, record: &KafkaRecord) -> String; +} diff --git a/crates/command/src/headless/formatter/plain_formatter.rs b/crates/command/src/headless/formatter/plain_formatter.rs new file mode 100644 index 0000000..ae6c556 --- /dev/null +++ b/crates/command/src/headless/formatter/plain_formatter.rs @@ -0,0 +1,49 @@ +//! A plain formatter formats a given kafka record like this +//! ```log +//! 2023-01-01T01:00:00.000+01:00 hello-world[0][2] - Step 3: Consume data: rpk topic consume my-topic +//! 
``` + +use super::KafkaFormatter; +use lib::KafkaRecord; +#[derive(Clone)] +pub struct PlainFormatter {} + +impl Default for PlainFormatter { + fn default() -> Self { + Self::new() + } +} + +impl PlainFormatter { + pub fn new() -> Self { + Self {} + } +} + +impl KafkaFormatter for PlainFormatter { + fn fmt(&self, record: &KafkaRecord) -> String { + let topic = &record.topic; + let split = match topic.len() < 16 { + true => topic.len() - 1, + false => 15, + }; + let split_pos = topic.char_indices().nth_back(split).unwrap().0; + + let prefix = format!( + "{}[{}][{}]", + &topic[split_pos..], + record.partition, + record.offset + ); + format!( + "{} {:<10} {:>15} - {}", + record + .timestamp_as_local_date_time() + .map(|t| t.to_rfc3339_opts(chrono::SecondsFormat::Millis, true)) + .unwrap_or("".to_string()), + prefix, + record.key, + record.value_as_string + ) + } +} diff --git a/crates/command/src/headless/formatter/simple_formatter.rs b/crates/command/src/headless/formatter/simple_formatter.rs new file mode 100644 index 0000000..835edb9 --- /dev/null +++ b/crates/command/src/headless/formatter/simple_formatter.rs @@ -0,0 +1,39 @@ +/// Show a kafka record this way +/// ```log +/// 31234242 key:my-key 42 partition:0 my-topic size:91 +/// ``` +use lib::KafkaRecord; + +use super::KafkaFormatter; + +#[derive(Clone)] +pub struct SimpleFormatter {} + +impl Default for SimpleFormatter { + fn default() -> Self { + Self::new() + } +} + +impl SimpleFormatter { + pub fn new() -> Self { + Self {} + } +} + +impl KafkaFormatter for SimpleFormatter { + fn fmt(&self, record: &KafkaRecord) -> String { + format!( + "{} key:{:<20} {:>17} partition:{:<5} {:<70} size:{}", + record + .timestamp_as_local_date_time() + .map(|t| t.to_rfc3339_opts(chrono::SecondsFormat::Millis, true)) + .unwrap_or("".to_string()), + record.key, + format!("offset:{}", record.offset), + record.partition, + format!("topic:{}", record.topic), + record.size + ) + } +} diff --git a/crates/command/src/headless/formatter/transpose_formatter.rs b/crates/command/src/headless/formatter/transpose_formatter.rs new file mode 100644 index 0000000..ce13527 --- /dev/null +++ b/crates/command/src/headless/formatter/transpose_formatter.rs @@ -0,0 +1,58 @@ +/// Formats a kafka record this way: +/// ```log +/// Topic: my-topic +///Partition: 0 +/// Offset: 42 +///Timestamp: 21673674232 +/// Key: my-key +/// Value: my-value +/// Headers: +/// ``` +use super::KafkaFormatter; +use itertools::Itertools; +use lib::KafkaRecord; + +#[derive(Clone)] +pub struct TransposeFormatter {} + +impl Default for TransposeFormatter { + fn default() -> Self { + Self::new() + } +} + +impl TransposeFormatter { + pub fn new() -> Self { + Self {} + } +} + +impl KafkaFormatter for TransposeFormatter { + fn fmt(&self, record: &KafkaRecord) -> String { + format!( + r#" Topic: {} +Partition: {} + Offset: {} +Timestamp: {} + Key: {} + Value: {} + Headers: {} +"#, + record.topic, + record.partition, + record.offset, + record + .timestamp_as_local_date_time() + .map(|d| d.to_rfc3339_opts(chrono::SecondsFormat::Millis, true)) + .unwrap_or("".to_string()), + record.key, + record.value_as_string, + record + .headers + .iter() + .sorted_by(|a, b| a.0.cmp(b.0)) + .map(|(k, v)| format!("{}='{}'", k, v)) + .join(", ") + ) + } +} diff --git a/crates/command/src/headless/mod.rs b/crates/command/src/headless/mod.rs new file mode 100644 index 0000000..5f17bbd --- /dev/null +++ b/crates/command/src/headless/mod.rs @@ -0,0 +1,123 @@ +//! Module gathering code for the headless mode. 
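All of the formatters above implement the small `KafkaFormatter` trait (`fn fmt(&self, record: &KafkaRecord) -> String`), so adding an output format is mostly a matter of one more implementation. A rough sketch, not part of this patch, of a hypothetical formatter that prints only the record coordinates and the key:

```rust
use lib::KafkaRecord;

use super::KafkaFormatter;

/// Hypothetical formatter printing `topic/partition@offset key=...` only.
#[derive(Clone)]
pub struct CoordinatesFormatter {}

impl KafkaFormatter for CoordinatesFormatter {
    fn fmt(&self, record: &KafkaRecord) -> String {
        format!(
            "{}/{}@{} key={}",
            record.topic, record.partition, record.offset, record.key
        )
    }
}
```

To expose such a formatter on the command line, it would also need a variant in `KafkaFormatterOption` and a matching arm in `MainCommand::formatter()`.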
+ +use app::search::Search; +use app::search::SearchContext; +use app::App; +use rdkafka::message::OwnedMessage; +use rdkafka::Message; +use std::time::Duration; +use std::time::Instant; +use tokio::select; +use tokio::sync::mpsc; + +use futures::{StreamExt, TryStreamExt}; +use indicatif::ProgressBar; +use lib::Error; +use lib::KafkaRecord; +use log::info; +use rdkafka::consumer::Consumer; +use tokio_util::sync::CancellationToken; + +use self::formatter::KafkaFormatter; +pub mod formatter; + +pub struct Headless { + app: App, + pub(crate) topics: Vec, + pub(crate) formatter: Box, + progress: ProgressBar, + export_records: bool, +} + +impl Headless { + pub fn new( + app: App, + topics: &[String], + formatter: Box, + export_records: bool, + progress: ProgressBar, + ) -> Self { + Self { + app, + topics: topics.to_owned(), + formatter, + progress, + export_records, + } + } + + pub async fn run(&self) -> Result<(), Error> { + if self.topics.is_empty() { + return Err("Please specify topics to consume".into()); + } + info!("Creating consumer for topics [{}]", self.topics.join(", ")); + let consumer = self.app.create_consumer(&self.topics)?; + let mut records_channel = tokio::sync::mpsc::unbounded_channel::(); + let search_query = self.app.search_query.clone(); + let token = CancellationToken::new(); + let progress = self.progress.clone(); + progress.enable_steady_tick(Duration::from_secs(10)); + let count = self + .app + .estimate_number_of_records_to_read(consumer.assignment()?)?; + progress.set_length(count as u64); + + let (tx_dd, mut rx_dd) = mpsc::unbounded_channel::(); + let mut schema_registry = self.app.schema_registry().clone(); + let token_cloned = token.clone(); + tokio::spawn(async move { + loop { + let mut limit = 0; + select! { + _ = token_cloned.cancelled() => { + info!("Consumer is about to be cancelled"); + return; + }, + Some(message) = rx_dd.recv() => { + let record = KafkaRecord::parse(message, &mut schema_registry).await; + let context = SearchContext::new(&record); + if search_query.matches(&context) { + records_channel.0.send(record).unwrap(); + limit += 1; + } + if let Some(query_limit) = search_query.limit() { + if limit >= query_limit { + token_cloned.cancel(); + } + } + } + } + } + }); + + tokio::spawn(async move { + let mut current_time = Instant::now(); + let task = consumer + .stream() + .take_until(token.cancelled()) + .try_for_each(|message| { + let message = message.detach(); + let timestamp = message.timestamp().to_millis().unwrap_or_default(); + tx_dd.send(message).unwrap(); + + if current_time.elapsed() > Duration::from_secs(20) { + current_time = Instant::now(); + info!("Checkpoint: {}", timestamp); + } + progress.inc(1); + futures::future::ok(()) + }) + .await; + info!("Consumer is terminated"); + task + }); + + while let Some(record) = records_channel.1.recv().await { + println!("{}", self.formatter.fmt(&record)); + if self.export_records { + self.app.export_record(&record)?; + } + } + Ok(()) + } +} diff --git a/crates/command/src/lib.rs b/crates/command/src/lib.rs new file mode 100644 index 0000000..41f5d62 --- /dev/null +++ b/crates/command/src/lib.rs @@ -0,0 +1,12 @@ +//! Structures and utility functions to build the command line interface of `yozefu`. +//! It relies on the [`clap`](https://crates.io/crates/clap) crate. 
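The `Headless::run` loop above is built around three tokio primitives: an unbounded mpsc channel carrying raw messages to a decoding task, a second channel carrying matching records back to the printer, and a `CancellationToken` that the decoding task fires once the `limit` clause is satisfied. Below is a stripped-down, self-contained sketch of that pattern, with plain strings standing in for Kafka messages (illustrative only; note the accepted counter has to live outside the `loop` so it accumulates across iterations):

```rust
use tokio::select;
use tokio::sync::mpsc;
use tokio_util::sync::CancellationToken;

#[tokio::main]
async fn main() {
    let (tx, mut rx) = mpsc::unbounded_channel::<String>();
    let token = CancellationToken::new();
    let token_cloned = token.clone();
    let limit = 3usize;

    // Decoding/filtering task: cancels the token once `limit` items were accepted.
    let worker = tokio::spawn(async move {
        let mut accepted = 0usize;
        loop {
            select! {
                _ = token_cloned.cancelled() => break,
                message = rx.recv() => match message {
                    Some(item) => {
                        println!("matched: {item}");
                        accepted += 1;
                        if accepted >= limit {
                            token_cloned.cancel();
                        }
                    }
                    None => break, // channel closed, nothing left to read
                },
            }
        }
    });

    // Producer side: in the real code this is the rdkafka consumer stream.
    for i in 0..10 {
        if token.is_cancelled() {
            break;
        }
        let _ = tx.send(format!("record {i}"));
    }
    drop(tx);
    let _ = worker.await;
}
```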
+ +mod cli; +mod command; +mod headless; +mod log; +pub use clap::Parser; +pub use cli::{config_path, Cli}; +pub use tui::TuiError; + +pub use app::APPLICATION_NAME; diff --git a/crates/command/src/log.rs b/crates/command/src/log.rs new file mode 100644 index 0000000..6066749 --- /dev/null +++ b/crates/command/src/log.rs @@ -0,0 +1,50 @@ +//! Logging utilities. + +use chrono::Local; +use lib::Error; +use log::{LevelFilter, SetLoggerError}; +use std::fs::OpenOptions; +use std::io::Write; +use std::path::PathBuf; + +/// Returns the log level based on the debug flag. +fn log_level(is_debug: bool) -> LevelFilter { + match is_debug { + true => LevelFilter::Debug, + false => LevelFilter::Info, + } +} + +/// When the user starts the headless mode, it writes logs to `stderr`. +pub(crate) fn init_logging_stderr(is_debug: bool) -> Result<(), SetLoggerError> { + let level = log_level(is_debug); + let mut logger = env_logger::builder(); + logger + .filter_level(level) + .target(env_logger::Target::Stderr) + .try_init() +} + +/// When the user starts the TUI, it writes logs to a file. +pub(crate) fn init_logging_file(is_debug: bool, path: &PathBuf) -> Result<(), Error> { + let level = log_level(is_debug); + let file = OpenOptions::new().append(true).create(true).open(path)?; + + let target = Box::new(file); + env_logger::builder() + .target(env_logger::Target::Pipe(target)) + .filter_level(level) + .format(|buf, record| { + writeln!( + buf, + "[{} {} {}:{}] {}", + Local::now().to_rfc3339_opts(chrono::SecondsFormat::Millis, false), + record.level(), + record.file().unwrap_or("unknown"), + record.line().unwrap_or(0), + record.args() + ) + }) + .try_init() + .map_err(|e| Error::Error(e.to_string())) +} diff --git a/crates/command/themes.json b/crates/command/themes.json new file mode 100644 index 0000000..b4b2afb --- /dev/null +++ b/crates/command/themes.json @@ -0,0 +1,77 @@ +{ + "light": { + "name": "light", + "fg": "Black", + "bg": "White", + "black": "Black", + "red": "Red", + "green": "Green", + "yellow": "Yellow", + "blue": "Blue", + "magenta": "Magenta", + "cyan": "Cyan", + "white": "White", + "orange": "LightRed", + "focused_border": "Blue", + "bg_focused_selected": "Black", + "fg_focused_selected": "White", + "bg_unfocused_selected": "White", + "fg_unfocused_selected": "Reset", + "bg_disabled": "7", + "fg_disabled": "Black", + "bg_active": "Green", + "fg_active": "Black", + "dialog_border": "Yellow", + "autocomplete": "#646464" + }, + "dark": { + "name": "dark", + "fg": "8", + "bg": "Black", + "black": "Black", + "red": "Red", + "green": "Green", + "yellow": "Yellow", + "blue": "Blue", + "magenta": "Magenta", + "cyan": "Cyan", + "white": "White", + "orange": "LightRed", + "focused_border": "Cyan", + "bg_focused_selected": "Cyan", + "fg_focused_selected": "Black", + "bg_unfocused_selected": "8", + "fg_unfocused_selected": "Black", + "bg_disabled": "Gray", + "fg_disabled": "Black", + "bg_active": "LightGreen", + "fg_active": "Black", + "dialog_border": "Yellow", + "autocomplete": "#646464" + }, + "solarized-dark": { + "name": "solarized-dark", + "fg": "Black", + "bg": "DarkGray", + "black": "Black", + "red": "Red", + "green": "Green", + "yellow": "Yellow", + "blue": "Blue", + "magenta": "Magenta", + "cyan": "Cyan", + "white": "White", + "orange": "LightRed", + "focused_border": "Blue", + "bg_focused_selected": "White", + "fg_focused_selected": "DarkGray", + "bg_unfocused_selected": "12", + "fg_unfocused_selected": "Black", + "bg_disabled": "LightGreen", + "fg_disabled": "DarkGray", + "bg_active": 
"Green", + "fg_active": "DarkGray", + "dialog_border": "Yellow", + "autocomplete": "#646464" + } +} \ No newline at end of file diff --git a/crates/lib/Cargo.toml b/crates/lib/Cargo.toml new file mode 100644 index 0000000..effd35f --- /dev/null +++ b/crates/lib/Cargo.toml @@ -0,0 +1,38 @@ +[package] +name = "yozefu-lib" +description = "Core library of yozefu" +readme = "README.md" +keywords = ["kafka", "consumer", "records"] +categories = ["data-structures", "parser-implementations"] +version.workspace = true +authors.workspace = true +edition.workspace = true +homepage.workspace = true +license.workspace = true +repository.workspace = true + +[dependencies] +serde = { version = "1.0.215", features = ["derive"] } +serde_json = { version = "1.0.133", features = ["preserve_order"] } +chrono = { version = "0.4.39", features = ["serde"], optional = true } +itertools = "0.13.0" +strum = { version = "0.26.3", features = ["derive", "strum_macros"], optional = true } +fuzzydate = {version = "0.2.2", optional = true } +nom = "8.0.0-beta.1" +rdkafka = { version = "0.37.0", features = ["cmake-build"], optional = true} +url = "2.5.4" +apache-avro = "0.17.0" +reqwest = { version = "0.12.9", features = ["json"] } + + +[dev-dependencies] +insta = { version = "1.41.1", features = ["filters", "glob"] } +protobuf = "3.7.1" + +[features] +native = [ + "dep:chrono", + "dep:rdkafka", + "dep:fuzzydate", + "dep:strum", +] diff --git a/crates/lib/LICENSE b/crates/lib/LICENSE new file mode 100644 index 0000000..c462c3a --- /dev/null +++ b/crates/lib/LICENSE @@ -0,0 +1,13 @@ +Copyright [2024] yozefu-lib + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. +You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. \ No newline at end of file diff --git a/crates/lib/README.md b/crates/lib/README.md new file mode 100644 index 0000000..94b5876 --- /dev/null +++ b/crates/lib/README.md @@ -0,0 +1,17 @@ +# yozefu-lib + +[![Build](https://github.com/MAIF/yozefu/actions/workflows/build.yml/badge.svg)](https://github.com/MAIF/yozefu/actions/workflows/build.yml) +[![](https://img.shields.io/crates/v/yozefu-lib.svg)](https://crates.io/crates/yozefu-lib) + + +This crate provides the core definitions for [yozefu](https://github.com/MAIF/yozefu). + - [`KafkaRecord`](./src/kafka/kafka_record.rs), the main structure used everywhere representing a kafka record. + - Definitions of errors that can occur. + - Structures and functions to parse and execute search queries. + + +## Usage + +```bash +cargo add yozefu-lib +``` \ No newline at end of file diff --git a/crates/lib/src/error.rs b/crates/lib/src/error.rs new file mode 100644 index 0000000..f3a5cbd --- /dev/null +++ b/crates/lib/src/error.rs @@ -0,0 +1,99 @@ +//! Error definitions and their implementations of the `From` trait. 
+use rdkafka::error::KafkaError; +use std::fmt::Display; +use std::str::Utf8Error; +use std::{ + fmt::{self, Formatter}, + num::TryFromIntError, +}; + +/// All kinds of errors that may occur +#[derive(Debug)] +pub enum Error { + Error(String), + KafkaError(KafkaError), + IoError(std::io::Error), + SerdeError(serde_json::Error), + ThemeError(String), + Search(SearchError), + SchemaRegistry(String), + Tokio(String), +} + +#[derive(Debug)] +pub enum SearchError { + Parse(String), +} + +impl std::error::Error for Error {} + +impl Display for Error { + fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result { + match self { + Error::IoError(e) => write!(f, "IO Error: {}", e), + Error::SerdeError(e) => write!(f, "Serde Error: {}", e), + Error::ThemeError(e) => write!(f, "Theme Error: {}", e), + Error::KafkaError(e) => write!(f, "Kafka Error: {}", e), + Error::Error(e) => write!(f, "{}", e), + Error::Tokio(e) => write!(f, "Tokio Error: {}", e), + Error::Search(e) => write!(f, "{}", e), + Error::SchemaRegistry(e) => write!(f, "Schema registry Error: {}", e), + } + } +} + +impl Display for SearchError { + fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result { + match self { + SearchError::Parse(e) => write!(f, "Cannot parse query '{}'", e), + } + } +} + +impl From for Error { + fn from(e: std::fmt::Error) -> Self { + Error::Error(e.to_string()) + } +} + +impl From for Error { + fn from(e: KafkaError) -> Self { + Error::KafkaError(e) + } +} + +impl From for Error { + fn from(e: std::io::Error) -> Self { + Error::IoError(e) + } +} + +impl From for Error { + fn from(e: serde_json::Error) -> Self { + Error::SerdeError(e) + } +} + +impl From<&str> for Error { + fn from(e: &str) -> Self { + Error::Error(e.to_string()) + } +} + +impl From for Error { + fn from(e: TryFromIntError) -> Self { + Error::Error(e.to_string()) + } +} + +impl From for Error { + fn from(e: strum::ParseError) -> Self { + Error::Error(e.to_string()) + } +} + +impl From for Error { + fn from(e: Utf8Error) -> Self { + Error::Error(e.to_string()) + } +} diff --git a/crates/lib/src/kafka/avro.rs b/crates/lib/src/kafka/avro.rs new file mode 100644 index 0000000..96d1e1c --- /dev/null +++ b/crates/lib/src/kafka/avro.rs @@ -0,0 +1,64 @@ +use apache_avro::types::Value; +use serde_json::{Map, Number}; + +/// Converts an Avro value to a JSON value. 
+pub(crate) fn avro_to_json(value: Value) -> serde_json::Value { + match value { + Value::Null => serde_json::Value::Null, + Value::Boolean(b) => serde_json::Value::Bool(b), + Value::Int(i) => serde_json::Value::Number(Number::from(i)), + Value::Long(l) => serde_json::Value::Number(Number::from(l)), + Value::Float(f) => serde_json::Value::Number(Number::from_f64(f.into()).unwrap()), + Value::Double(f) => serde_json::Value::Number(Number::from_f64(f).unwrap()), + Value::Bytes(vec) => serde_json::Value::Array( + vec.iter() + .map(|b| serde_json::Value::Number(Number::from(*b))) + .collect(), + ), + Value::String(s) => serde_json::Value::String(s), + Value::Fixed(_, vec) => serde_json::Value::Array( + vec.iter() + .map(|b| serde_json::Value::Number(Number::from(*b))) + .collect(), + ), + Value::Enum(_, s) => serde_json::Value::String(s), + Value::Union(_, value) => avro_to_json(*value), + Value::Array(vec) => { + serde_json::Value::Array(vec.iter().map(|v| avro_to_json(v.clone())).collect()) + } + Value::Map(hash_map) => serde_json::Value::Object( + hash_map + .into_iter() + .map(|(k, v)| (k, avro_to_json(v))) + .collect(), + ), + Value::Record(vec) => { + serde_json::Value::Object(vec.into_iter().map(|(k, v)| (k, avro_to_json(v))).collect()) + } + Value::Date(date) => serde_json::Value::Number(Number::from(date)), + Value::TimeMillis(ts) => serde_json::Value::Number(Number::from(ts)), + Value::TimeMicros(ts) => serde_json::Value::Number(Number::from(ts)), + Value::TimestampMillis(ts) => serde_json::Value::Number(Number::from(ts)), + Value::TimestampMicros(ts) => serde_json::Value::Number(Number::from(ts)), + Value::TimestampNanos(ts) => serde_json::Value::Number(Number::from(ts)), + Value::LocalTimestampMillis(ts) => serde_json::Value::Number(Number::from(ts)), + Value::LocalTimestampMicros(ts) => serde_json::Value::Number(Number::from(ts)), + Value::LocalTimestampNanos(ts) => serde_json::Value::Number(Number::from(ts)), + Value::Uuid(uuid) => serde_json::Value::String(uuid.to_string()), + Value::Duration(duration) => { + let mut map = Map::with_capacity(3); + let i: u32 = duration.months().into(); + map.insert("months".to_string(), serde_json::Value::Number(i.into())); + let i: u32 = duration.millis().into(); + map.insert("millis".to_string(), serde_json::Value::Number(i.into())); + let i: u32 = duration.days().into(); + map.insert("days".to_string(), serde_json::Value::Number(i.into())); + serde_json::Value::Object(map) + } + Value::Decimal(_decimal) => serde_json::Value::String( + "Yozefu error: I don't know how to encode a decimal to json. It fails silently" + .to_string(), + ), + Value::BigDecimal(big_decimal) => serde_json::Value::String(big_decimal.to_string()), + } +} diff --git a/crates/lib/src/kafka/data_type.rs b/crates/lib/src/kafka/data_type.rs new file mode 100644 index 0000000..64351ec --- /dev/null +++ b/crates/lib/src/kafka/data_type.rs @@ -0,0 +1,128 @@ +//! As you you, a kafka record is just a bunch of bytes. The key and the value of a record can be of different types. +//! This module defines the different data types supported. +//! 
More details about the bytes format when using a schema: +use std::fmt::Display; + +use serde::Deserialize; +use serde::Serialize; + +use crate::search::compare::StringOperator; + +#[derive(Clone, Debug, Deserialize, Serialize, Hash, PartialEq, Eq)] +#[serde(untagged)] +pub enum DataType { + Json(serde_json::Value), + String(String), +} + +impl From for serde_json::Value { + fn from(val: DataType) -> Self { + match val { + DataType::Json(value) => value, + DataType::String(s) => serde_json::Value::String(s), + } + } +} + +impl Default for DataType { + fn default() -> Self { + Self::String("".to_string()) + } +} + +pub trait Comparable { + fn compare( + &self, + json_pointer: &Option, + operator: &StringOperator, + right: &str, + ) -> bool; +} + +impl Comparable for DataType { + fn compare( + &self, + json_pointer: &Option, + operator: &StringOperator, + right: &str, + ) -> bool { + match &self { + DataType::Json(value) => Self::compare_json(value, json_pointer, operator, right), + DataType::String(value) => Self::compare_string(value, operator, right), + } + } +} + +impl DataType { + fn compare_json( + value: &serde_json::Value, + json_pointer: &Option, + operator: &StringOperator, + right: &str, + ) -> bool { + let v = match json_pointer { + Some(path) => { + let path = path.replace(['.', '['], "/").replace(']', ""); + match value.pointer(&path) { + Some(d) => match d { + serde_json::Value::Null => "null".to_string(), + serde_json::Value::Bool(v) => v.to_string(), + serde_json::Value::Number(v) => v.to_string(), + serde_json::Value::String(v) => v.to_string(), + serde_json::Value::Array(_) => return false, + serde_json::Value::Object(_) => return false, + }, + None => return false, + } + } + None => serde_json::to_string(value).unwrap(), + }; + match operator { + StringOperator::Contain => v.contains(right), + StringOperator::Equal => v == right, + StringOperator::StartWith => v.starts_with(right), + StringOperator::NotEqual => v != right, + } + } + + fn compare_string(value: &str, operator: &StringOperator, right: &str) -> bool { + match operator { + StringOperator::Contain => value.contains(right), + StringOperator::Equal => value == right, + StringOperator::StartWith => value.starts_with(right), + StringOperator::NotEqual => value != right, + } + } + + pub fn raw(&self) -> String { + match &self { + DataType::Json(value) => match value { + serde_json::Value::Null => "null".to_string(), + serde_json::Value::Bool(b) => b.to_string(), + serde_json::Value::Number(number) => number.to_string(), + serde_json::Value::String(s) => s.to_string(), + serde_json::Value::Array(vec) => serde_json::to_string(vec).unwrap_or_default(), + serde_json::Value::Object(map) => serde_json::to_string(map).unwrap_or_default(), + }, + DataType::String(s) => s.clone(), + } + } + + pub fn to_string_pretty(&self) -> String { + match &self { + DataType::Json(value) => serde_json::to_string_pretty(value).unwrap_or_default(), + DataType::String(s) => s.clone(), + } + } +} + +impl Display for DataType { + fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { + match &self { + DataType::Json(value) => { + write!(f, "{}", serde_json::to_string(value).unwrap_or_default()) + } + DataType::String(s) => write!(f, "{}", s), + } + } +} diff --git a/crates/lib/src/kafka/exported_kafka_record.rs b/crates/lib/src/kafka/exported_kafka_record.rs new file mode 100644 index 0000000..7a80b9d --- /dev/null +++ b/crates/lib/src/kafka/exported_kafka_record.rs @@ -0,0 +1,49 @@ +use chrono::{DateTime, Local}; +use 
serde::{Deserialize, Serialize}; + +use crate::SearchQuery; + +use super::KafkaRecord; + +/// An exported Kafka record is a wrapper around the `KafkaRecord` struct +/// with additional fields for analytics purposes. +#[derive(Clone, Debug, Deserialize, Serialize, Hash, PartialEq, Eq)] +pub struct ExportedKafkaRecord { + #[serde(flatten)] + pub record: KafkaRecord, + date_time: Option<DateTime<Local>>, + /// Milliseconds elapsed since the first exported kafka record. + pub absolute_delta_in_ms: i64, + /// Milliseconds elapsed since the previous exported kafka record. + pub relative_delta_in_ms: i64, + pub search_query: String, +} + +impl ExportedKafkaRecord { + /// Calculates two time deltas: + /// - between the current record and the first received. + /// - between the current record and the previous one. + /// + /// The unit is the millisecond. + pub fn compute_deltas_ms(&mut self, first_ts: Option<i64>, previous_ts: Option<i64>) { + self.relative_delta_in_ms = self.record.timestamp.unwrap_or(0) - previous_ts.unwrap_or(0); + self.absolute_delta_in_ms = self.record.timestamp.unwrap_or(0) - first_ts.unwrap_or(0); + } + + pub fn set_search_query(&mut self, search_query: &SearchQuery) { + self.search_query = search_query.to_string(); + } +} + +impl From<&KafkaRecord> for ExportedKafkaRecord { + fn from(record: &KafkaRecord) -> Self { + let date_time = record.timestamp_as_local_date_time(); + Self { + record: record.clone(), + date_time, + absolute_delta_in_ms: 0, + relative_delta_in_ms: 0, + search_query: "".to_string(), + } + } +} diff --git a/crates/lib/src/kafka/kafka_record.rs b/crates/lib/src/kafka/kafka_record.rs new file mode 100644 index 0000000..04d66ac --- /dev/null +++ b/crates/lib/src/kafka/kafka_record.rs @@ -0,0 +1,215 @@ +#[cfg(feature = "native")] +use apache_avro::from_avro_datum; +use serde::Deserialize; +use serde::Serialize; +use std::collections::BTreeMap; + +/// Inspired by the [`rdkafka::Message`] struct. +/// Currently, we only support utf-8 string keys/values/headers. +#[derive(Clone, Debug, Deserialize, Serialize, Hash, PartialEq, Eq, Default)] +#[serde(rename_all = "lowercase")] +pub struct KafkaRecord { + /// A human-readable representation of the value + pub value: DataType, + #[serde(skip_serializing, default)] + /// The value as a string,
needed to be displayed in the TUI + pub value_as_string: String, + /// A human readable representation of the value + pub key: DataType, + #[serde(skip_serializing, default)] + pub key_as_string: String, + pub topic: String, + pub timestamp: Option, + pub partition: i32, + pub offset: i64, + pub headers: BTreeMap, + #[serde(skip_serializing_if = "Option::is_none")] + pub key_schema: Option, + #[serde(skip_serializing_if = "Option::is_none")] + pub value_schema: Option, + /// Number of bytes in the key + the value + #[serde(default)] + pub size: usize, +} + +#[cfg(feature = "native")] +use chrono::{DateTime, Local, Utc}; +#[cfg(feature = "native")] +use rdkafka::message::{Headers, Message, OwnedMessage}; + +#[cfg(feature = "native")] +use super::avro::avro_to_json; +use super::data_type::DataType; +use super::schema::Schema; +#[cfg(feature = "native")] +use super::schema::SchemaId; +#[cfg(feature = "native")] +use super::schema::SchemaType; +#[cfg(feature = "native")] +use super::schema_registry_client::SchemaResponse; +#[cfg(feature = "native")] +use super::SchemaRegistryClient; + +#[cfg(feature = "native")] +impl KafkaRecord { + pub fn timestamp_as_utc_date_time(&self) -> Option> { + DateTime::from_timestamp_millis(self.timestamp.unwrap_or(0)) + } + + pub fn timestamp_as_local_date_time(&self) -> Option> { + self.timestamp_as_utc_date_time() + .map(DateTime::::from) + } +} + +#[cfg(feature = "native")] +impl KafkaRecord { + pub async fn parse( + owned_message: OwnedMessage, + schema_registry: &mut Option, + ) -> Self { + let mut headers: BTreeMap = BTreeMap::new(); + if let Some(old_headers) = owned_message.headers() { + for header in old_headers.iter() { + headers.insert( + header.key.to_string(), + header + .value + .map(|e| { + String::from_utf8(e.to_vec()).unwrap_or("".to_string()) + }) + .unwrap_or_default(), + ); + } + } + + let size = owned_message.payload().map(|e| e.len()).unwrap_or(0) + + owned_message.key().map(|e| e.len()).unwrap_or(0); + + let (key, key_schema) = + Self::extract_data_and_schema(owned_message.key(), schema_registry).await; + let (value, value_schema) = + Self::extract_data_and_schema(owned_message.payload(), schema_registry).await; + + Self { + value_as_string: value.to_string(), + value, + key_as_string: key.to_string(), + key, + topic: owned_message.topic().to_string(), + timestamp: owned_message.timestamp().to_millis(), + partition: owned_message.partition(), + offset: owned_message.offset(), + headers, + key_schema, + value_schema, + size, + } + } + + fn payload_to_data_type(payload: Option<&[u8]>, schema: &Option) -> DataType { + if schema.is_none() { + return Self::deserialize_json(payload); + }; + + let schema = schema.as_ref().unwrap(); + match schema.schema_type { + Some(SchemaType::Json) => Self::deserialize_json(payload), + Some(SchemaType::Avro) => Self::deserialize_avro(payload, &schema.schema), + Some(SchemaType::Protobuf) => Self::deserialize_protobuf(payload, &schema.schema), + None => Self::deserialize_json(payload), + } + } + + /// Fallback to String if this is not json + /// Will I regret it ? Maybe + fn deserialize_json(payload: Option<&[u8]>) -> DataType { + let payload = payload.unwrap_or_default(); + match serde_json::from_slice(payload) { + Ok(e) => DataType::Json(e), + Err(_e) => DataType::String(String::from_utf8(payload.to_vec()).unwrap_or_default()), + //Err(e) => DataType::String(format!("This is a Yozefu error. The Record can't be deserialized. 
For your information, avro and protobuf records are not supported yet.\n\n Error: {}\nPayload: {:?}", e, payload.to_vec())), + } + } + + fn deserialize_avro(payload: Option<&[u8]>, schema: &str) -> DataType { + let mut payload = payload.unwrap_or_default(); + let parsed_schema = apache_avro::Schema::parse_str(schema); + if let Err(e) = &parsed_schema { + return DataType::String(format!( + " Yozefu Error: The avro schema could not be parsed. Please check the schema in the schema registry.\n Error: {}\n Payload: {:?}\n String: {}", + e, + payload, + String::from_utf8(payload.to_vec()).unwrap_or_default() + )); + } + let parsed_schema = parsed_schema.unwrap(); + match from_avro_datum(&parsed_schema, &mut payload, None) { + Ok(value) => DataType::Json(avro_to_json(value)), + Err(e) => DataType::String(format!( + " Yozefu Error: According to the schema registry, the record is serialized as avro but there was an issue deserializing the payload: {:?}\n Payload: {:?}\n String: {}", + e, + payload, + String::from_utf8(payload.to_vec()).unwrap_or_default() + )), + } + } + + fn deserialize_protobuf(payload: Option<&[u8]>, schema: &str) -> DataType { + let payload = payload.unwrap_or_default(); + DataType::String(format!( + " Error: Protobuf deserialization is not supported yet in Yozefu. Any contribution is welcome!\n Github: https://github.com/MAIF/yozefu\nPayload: {:?}\n String: {}\n Schema:\n{}", + payload, + String::from_utf8(payload.to_vec()).unwrap_or_default().trim(), + schema, + )) + } + + async fn extract_data_and_schema( + payload: Option<&[u8]>, + schema_registry: &mut Option, + ) -> (DataType, Option) { + let schema_id = SchemaId::parse(payload); + match (schema_id, schema_registry.as_mut()) { + (None, _) => (Self::payload_to_data_type(payload, &None), None), + (Some(id), None) => { + let payload = payload.unwrap_or_default(); + match serde_json::from_slice(payload) { + Ok(e) => (DataType::Json(e), None), + Err(_e) => (DataType::String(format!("Yozefu was not able to retrieve the schema {} because there is no schema registry configured. 
Please visit https://github.com/MAIF/yozefu/blob/main/docs/schema-registry/README.md for more details.\nPayload: {:?}\n String: {}", id, payload, + String::from_utf8(payload.to_vec()).unwrap_or_default())), None) + } + } + (Some(s), Some(schema_registry)) => { + let p = payload.unwrap_or_default(); + let (schema_response, schema) = match schema_registry.schema(s.0).await { + Ok(Some(d)) => (Some(d.clone()), Some(Schema::new(s, d.schema_type))), + Ok(None) => (None, Some(Schema::new(s, None))), + Err(_e) => { + let payload = payload.unwrap_or_default(); + return (DataType::String( + format!("{}.\nYozefu was not able to retrieve the schema {}.\nPlease make sure the schema registry is correctly configured.\nPayload: {:?}\n String: {}", + _e, + s.0, + payload, + String::from_utf8(payload.to_vec()).unwrap_or_default())), + Some(Schema::new(s, None))); + } + }; + match p.len() <= 5 { + true => ( + Self::payload_to_data_type(payload, &schema_response), + schema, + ), + false => ( + Self::payload_to_data_type( + payload.map(|e| e[5..].as_ref()), + &schema_response, + ), + schema, + ), + } + } + } + } +} diff --git a/crates/lib/src/kafka/mod.rs b/crates/lib/src/kafka/mod.rs new file mode 100644 index 0000000..2c36374 --- /dev/null +++ b/crates/lib/src/kafka/mod.rs @@ -0,0 +1,19 @@ +#[cfg(feature = "native")] +pub mod exported_kafka_record; +#[cfg(feature = "native")] +pub use exported_kafka_record::ExportedKafkaRecord; +#[cfg(feature = "native")] +mod schema_registry_client; +#[cfg(feature = "native")] +pub mod topic; +#[cfg(feature = "native")] +pub use schema_registry_client::SchemaRegistryClient; +#[cfg(feature = "native")] +mod avro; + +mod data_type; +mod kafka_record; +mod schema; +pub use data_type::Comparable; +pub use data_type::DataType; +pub use kafka_record::KafkaRecord; diff --git a/crates/lib/src/kafka/schema.rs b/crates/lib/src/kafka/schema.rs new file mode 100644 index 0000000..8491487 --- /dev/null +++ b/crates/lib/src/kafka/schema.rs @@ -0,0 +1,95 @@ +//! Structs and functions for key and value schemas. 
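Both `extract_data_and_schema` above and `SchemaId::parse` below assume the Confluent wire format: a zero magic byte, a big-endian 4-byte schema id, then the encoded payload; records without that 5-byte prefix are treated as plain JSON or text. A small worked example of the framing, reusing the bytes from the unit test further down:

```rust
fn main() {
    // Confluent framing: [magic byte][schema id, 4 bytes big-endian][encoded payload...]
    let payload: &[u8] = &[0, 0, 0, 4, 2, b'{', b'}'];
    assert_eq!(payload[0], 0); // magic byte must be 0
    let id = u32::from_be_bytes([payload[1], payload[2], payload[3], payload[4]]);
    assert_eq!(id, 1026); // 0*2^24 + 0*2^16 + 4*256 + 2 = 1026
    // Everything from index 5 onwards is the Avro/JSON/Protobuf-encoded data.
    assert_eq!(&payload[5..], b"{}");
}
```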
+#[cfg(feature = "native")] +use std::{fmt::Display, io::Read}; + +use serde::{Deserialize, Serialize}; + +#[derive(Clone, Debug, Deserialize, Serialize, Hash, PartialEq, Eq, Default)] +pub struct SchemaId(pub u32); + +#[derive(Clone, Debug, Deserialize, Serialize, Hash, PartialEq, Eq)] +#[serde(rename_all = "UPPERCASE")] +pub enum SchemaType { + Json, + Avro, + Protobuf, +} + +#[derive(Clone, Debug, Deserialize, Serialize, Hash, PartialEq, Eq)] +pub struct Schema { + pub id: SchemaId, + #[serde(skip_serializing_if = "Option::is_none")] + pub schema_type: Option, +} + +impl Schema { + pub fn new(id: SchemaId, schema_type: Option) -> Self { + Self { id, schema_type } + } +} + +#[cfg(feature = "native")] +impl Display for SchemaId { + fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { + write!(f, "{}", self.0) + } +} + +#[cfg(feature = "native")] +impl Display for SchemaType { + fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { + match self { + SchemaType::Json => write!(f, "json"), + SchemaType::Avro => write!(f, "avro"), + SchemaType::Protobuf => write!(f, "protobuf"), + } + } +} + +impl SchemaId { + pub fn new(id: u32) -> Self { + Self(id) + } +} + +impl Default for SchemaType { + fn default() -> Self { + Self::Json + } +} + +#[cfg(feature = "native")] +const MAGIC_BYTE: u8 = 0; + +#[cfg(feature = "native")] +impl SchemaId { + /// More details at + pub fn parse(payload: Option<&[u8]>) -> Option { + let mut payload = payload.unwrap_or_default(); + let mut magic_byte_and_schema_id_buffer = [0u8; 5]; + match payload.read_exact(&mut magic_byte_and_schema_id_buffer) { + Ok(_) => { + let mut schema_id_buffer = [0u8; 4]; + if magic_byte_and_schema_id_buffer[0] != MAGIC_BYTE { + return None; + } + schema_id_buffer.copy_from_slice(&magic_byte_and_schema_id_buffer[1..]); + + Some(SchemaId(u32::from_be_bytes(schema_id_buffer))) + } + Err(_) => None, + } + } +} + +#[test] +fn test_parse_schema_id() { + assert_eq!(SchemaId::parse(None), None); + assert_eq!(SchemaId::parse(Some(&[0, 0, 0, 0, 0])), Some(SchemaId(0))); + assert_eq!(SchemaId::parse(Some(&[0, 0, 0, 0, 1])), Some(SchemaId(1))); + assert_eq!( + SchemaId::parse(Some(&[0, 0, 0, 4, 2])), + Some(SchemaId(1026)) + ); + assert_eq!(SchemaId::parse(Some(&[54, 0, 0, 0, 1])), None); +} diff --git a/crates/lib/src/kafka/schema_registry_client.rs b/crates/lib/src/kafka/schema_registry_client.rs new file mode 100644 index 0000000..ed5451f --- /dev/null +++ b/crates/lib/src/kafka/schema_registry_client.rs @@ -0,0 +1,129 @@ +use reqwest::header::{self, HeaderMap, HeaderName, HeaderValue}; +use serde::{Deserialize, Serialize}; +use serde_json::Value; +use std::{collections::HashMap, str::FromStr, time::Duration}; +use url::Url; + +use crate::Error; + +use super::schema::SchemaType; + +#[derive(Clone, Debug)] +/// A HTTP client to communicate with a confluent schema registry +struct SimpleSchemaRegistryClient { + url: Url, + client: reqwest::Client, +} + +impl SimpleSchemaRegistryClient { + fn new(url: Url, headers: &HashMap) -> Self { + let mut default_headers = HeaderMap::new(); + // https://docs.confluent.io/platform/current/schema-registry/develop/api.html#content-types + default_headers.insert( + header::ACCEPT, + HeaderValue::from_static("application/vnd.schemaregistry.v1+json"), + ); + for (key, value) in headers { + default_headers.insert( + HeaderName::from_str(key).unwrap(), + HeaderValue::from_str(value).unwrap(), + ); + } + let builder = reqwest::Client::builder() + .timeout(Duration::from_secs(5)) + 
.default_headers(default_headers); + Self { + url, + client: builder.build().unwrap(), + } + } + + /// Tries to infer the schema type from the schema string + fn compute_schema_type(schema: &SchemaResponse) -> Option { + match &schema.schema_type { + Some(s) => Some(s.clone()), + None => { + // If the schema type is not provided, we try to infer it from the schema + let schema_string = &schema.schema; + match serde_json::from_str::(schema_string) { + Ok(v) => { + // is it avro ? + if v.get("type").is_some() && v.get("namespace").is_some() { + return Some(SchemaType::Avro); + } + // TODO So it should be json ? + // Some(SchemaType::Json) + None + } + Err(_) => { + // is it protobuf ? + if schema_string.contains("proto2") || schema_string.contains("proto3") { + return Some(SchemaType::Protobuf); + } + None + } + } + } + } + } + + async fn schema(&self, id: u32) -> Result, Error> { + // TODO https://github.com/servo/rust-url/issues/333 + let mut url = self.url.clone(); + let mut segments = url.path_segments_mut().unwrap(); + segments.extend(vec!["schemas", "ids", &id.to_string()]); + drop(segments); + + let response = self.client.get(url).send().await; + + match response { + Ok(response) => { + if response.status().is_success() { + let mut json = response.json::().await.unwrap(); + json.schema_type = Self::compute_schema_type(&json); + return Ok(Some(json)); + } + Ok(None) + } + + Err(e) => Err(Error::SchemaRegistry(e.to_string())), + } + } +} + +#[derive(Clone, Debug)] +/// A HTTP client to communicate with a confluent schema registry +/// All schemas are cached +pub struct SchemaRegistryClient { + client: SimpleSchemaRegistryClient, + cache: HashMap, +} + +impl SchemaRegistryClient { + pub fn new(base_url: Url, headers: &HashMap) -> Self { + Self { + client: SimpleSchemaRegistryClient::new(base_url, headers), + cache: HashMap::default(), + } + } + + pub async fn schema(&mut self, id: u32) -> Result, Error> { + match self.cache.get(&id) { + Some(schema) => Ok(Some(schema.clone())), + None => { + let schema = self.client.schema(id).await?; + if let Some(schema) = schema.clone() { + self.cache.insert(id, schema.clone()); + } + Ok(schema) + } + } + } +} + +#[derive(Clone, Debug, Deserialize, Serialize, Hash, PartialEq, Eq)] +#[serde(rename_all = "camelCase")] +pub struct SchemaResponse { + pub schema: String, + pub schema_type: Option, +} diff --git a/crates/lib/src/kafka/topic.rs b/crates/lib/src/kafka/topic.rs new file mode 100644 index 0000000..6916222 --- /dev/null +++ b/crates/lib/src/kafka/topic.rs @@ -0,0 +1,83 @@ +//! Additional struct definitions regarding topics metadata: +//! - List of consumers, their states, the lag... +//! - Number of partitions +//! - Number of replicas + +use serde::{Deserialize, Serialize}; +use strum::{Display, EnumIter, EnumString}; + +/// Information regarding a given topic, their consumers, the number of partitions... 
+#[derive(Clone, Debug, Hash, PartialEq, Eq, PartialOrd, Default, Ord)] +pub struct TopicDetail { + pub name: String, + pub partitions: usize, + pub replicas: usize, + pub consumer_groups: Vec, +} + +/// Information regarding a given consumer +#[derive(Clone, Debug, Hash, PartialEq, Eq, PartialOrd, Default, Ord)] +pub struct ConsumerGroupDetail { + pub name: String, + pub members: Vec, + pub state: ConsumerGroupState, +} + +/// All the different states of a kafka consumer +#[derive( + Debug, + Clone, + EnumString, + EnumIter, + Display, + Deserialize, + Serialize, + Hash, + PartialEq, + Eq, + PartialOrd, + Ord, + Copy, +)] +#[strum(serialize_all = "PascalCase")] +#[serde(rename_all = "PascalCase")] +#[derive(Default)] +pub enum ConsumerGroupState { + #[default] + Unknown, + Empty, + Dead, + Stable, + PreparingRebalance, + CompletingRebalance, + Rebalancing, + UnknownRebalance, +} + +impl ConsumerGroupDetail { + pub fn lag(&self) -> usize { + self.members + .iter() + .map(|m| m.end_offset - m.start_offset) + .sum() + } + + pub fn state(&self) -> bool { + true + } +} + +/// Information regarding a consumer group member. +#[derive(Clone, Debug, Hash, PartialEq, Eq, PartialOrd, Default, Ord)] +pub struct ConsumerGroupMember { + pub member: String, + pub start_offset: usize, + pub end_offset: usize, + pub assignments: Vec, +} + +#[derive(Clone, Debug, Hash, PartialEq, Eq, PartialOrd, Default, Ord)] +pub struct MemberAssignment { + pub topic: String, + pub partitions: Vec, +} diff --git a/crates/lib/src/lib.rs b/crates/lib/src/lib.rs new file mode 100644 index 0000000..0be5b8b --- /dev/null +++ b/crates/lib/src/lib.rs @@ -0,0 +1,19 @@ +//! This crate contains the core structs and enums for the tool. +//! It heavily relies on the [`rdkafka` crate](https://github.com/fede1024/rust-rdkafka). + +#[cfg(feature = "native")] +pub mod error; + +#[cfg(feature = "native")] +pub use { + error::Error, kafka::topic::*, kafka::ExportedKafkaRecord, search::parse_search_query, + search::SearchQuery, +}; + +pub mod kafka; +pub mod search; +pub use kafka::Comparable; +pub use kafka::DataType; +pub use kafka::KafkaRecord; +pub use search::compare::StringOperator; +pub use search::FilterResult; diff --git a/crates/lib/src/search/atom.rs b/crates/lib/src/search/atom.rs new file mode 100644 index 0000000..2075b6d --- /dev/null +++ b/crates/lib/src/search/atom.rs @@ -0,0 +1,46 @@ +//! Atoms are the smallest unit of an expression. They can be a symbol, a comparison, a filter or a parenthesized expression. 
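Before diving into the parser modules, it may help to see what they accept. The query lines below are lifted from doc comments and unit tests elsewhere in this diff (`compare/expression.rs`, `clause.rs`, `compare/mod_test.rs`); clauses such as `limit`, `from ...` and `order by`/`sort by` are parsed separately and combined with the search expression:

```bash
key == "my-key"
offset != 234
timestamp between "2 hours ago" and "1 hour ago"
timestamp between "2024-05-28T17:55:08.145+02:00" and now from begin
from end - 10
```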
+use std::fmt::Display; + +use nom::{ + branch::alt, bytes::complete::tag, combinator::map, sequence::delimited, IResult, Parser, +}; + +use super::{ + compare::{parse_compare, CompareExpression}, + expression::{parse_or_expression, Expression}, + filter::{parse_filter, Filter}, + symbol::Symbol, + wsi::wsi, +}; + +#[derive(Debug, PartialEq, Clone)] +pub enum Atom { + Symbol(Symbol), + Compare(CompareExpression), + Filter(Filter), + Parenthesis(Box), +} + +pub(crate) fn parse_atom(input: &str) -> IResult<&str, Atom> { + alt(( + map(wsi(parse_filter), Atom::Filter), + map(wsi(parse_compare), Atom::Compare), + map( + delimited(wsi(tag("(")), parse_or_expression, wsi(tag(")"))), + |expr: Expression| Atom::Parenthesis(Box::new(expr)), + ), + //map(parse_symbol, Atom::Symbol), + )) + .parse(input) +} + +impl Display for Atom { + fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { + match self { + Atom::Symbol(a) => write!(f, "{}", a), + Atom::Compare(a) => write!(f, "{}", a), + Atom::Parenthesis(a) => write!(f, "{}", a), + Atom::Filter(a) => write!(f, "{}", a), + } + } +} diff --git a/crates/lib/src/search/clause.rs b/crates/lib/src/search/clause.rs new file mode 100644 index 0000000..9a229a5 --- /dev/null +++ b/crates/lib/src/search/clause.rs @@ -0,0 +1,81 @@ +use nom::bytes::complete::tag_no_case; +/// Clauses are similar to clauses in the SQL language. +use nom::Parser; +use nom::{ + branch::alt, + combinator::{map, opt}, + sequence::{pair, preceded}, + IResult, +}; + +use super::expression::{parse_or_expression, Expression}; +use super::number::parse_unsigned_number; +use super::offset::{parse_from_offset, FromOffset}; +use super::order::{parse_order, parse_order_keyword, Order, OrderKeyword}; +use super::wsi::wsi; + +#[derive(Debug, Clone, PartialEq)] +pub(crate) enum SearchClause { + /// Clause that Limits the number of kafka records to consume + Limit(usize), + /// Clause containing the search expression + Expression(Expression), + /// Clause for telling the consumer where to start consuming from + From(FromOffset), + /// Clause defining how to sort the kafka records in the UI + OrderBy(Order, Option), +} + +pub(crate) fn parse_expression(input: &str) -> IResult<&str, SearchClause> { + map( + (opt(wsi(tag_no_case("where"))), wsi(parse_or_expression)), + |(_, e)| SearchClause::Expression(e), + ) + .parse(input) +} + +//pub(crate) fn parse_group_by_key(input: &str) -> IResult<&str, SearchClause> { +// value( +// SearchClause::GroupByKey, +// ( +// wsi(tag_no_case("group")), +// wsi(tag_no_case("by")), +// wsi(tag_no_case("key")), +// ), +// ) +// .parse(input) +//} + +pub(crate) fn parse_from_offset_clause(input: &str) -> IResult<&str, SearchClause> { + map(parse_from_offset, SearchClause::From).parse(input) +} + +pub(crate) fn parse_limit(input: &str) -> IResult<&str, SearchClause> { + map( + preceded(wsi(tag_no_case("limit")), wsi(parse_unsigned_number)), + SearchClause::Limit, + ) + .parse(input) +} + +pub(crate) fn parse_order_by(input: &str) -> IResult<&str, SearchClause> { + map( + preceded( + pair( + wsi(alt((tag_no_case("order"), tag_no_case("sort")))), + wsi(tag_no_case("by")), + ), + pair(parse_order, opt(parse_order_keyword)), + ), + |(o, oo)| SearchClause::OrderBy(o, oo), + ) + .parse(input) +} + +#[test] +fn test_parse_offset_clause() { + assert_eq!( + parse_from_offset_clause(r#"from end - 10"#), + Ok(("", SearchClause::From(FromOffset::OffsetTail(10)))) + ); +} diff --git a/crates/lib/src/search/compare/expression.rs 
b/crates/lib/src/search/compare/expression.rs new file mode 100644 index 0000000..5758853 --- /dev/null +++ b/crates/lib/src/search/compare/expression.rs @@ -0,0 +1,159 @@ +/// This module defines the parsers to comparison expressions. +/// ```bash +/// offset != 234 +/// key == "my-key" +/// timestamp between "2 hours ago" and "1 hour ago" +/// ``` +use std::fmt::Display; + +#[cfg(feature = "native")] +use chrono::{DateTime, Local}; +use nom::bytes::complete::tag_no_case; +use nom::Parser; +use nom::{ + branch::alt, + bytes::complete::tag, + combinator::{map, value}, + IResult, +}; + +use super::number::NumberOperator; +use super::string::StringOperator; + +#[cfg(feature = "native")] +#[derive(Debug, PartialEq, Clone, Eq)] +pub enum CompareExpression { + Partition(NumberOperator, i32), + OffsetTail(i64), + Offset(NumberOperator, i64), + Topic(StringOperator, String), + Key(StringOperator, String), + Value(Option, StringOperator, String), + Header(String, StringOperator, String), + Size(NumberOperator, i64), + Timestamp(NumberOperator, DateTime), + TimestampBetween(DateTime, DateTime), +} + +#[cfg(feature = "native")] +impl Display for CompareExpression { + fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { + match self { + CompareExpression::Partition(op, r) => write!(f, "partition {} {}", op, r), + CompareExpression::OffsetTail(r) => write!(f, "offsetTail - {}", r), + CompareExpression::Offset(op, r) => write!(f, "offset {} {}", op, r), + CompareExpression::Topic(op, r) => write!(f, "topic {} {}", op, r), + CompareExpression::Key(op, r) => write!(f, "key {} {}", op, r), + CompareExpression::Value(left, op, r) => write!( + f, + "value{} {} {}", + left.clone().unwrap_or("".to_string()), + op, + r + ), + CompareExpression::Header(left, op, r) => { + write!(f, "headers.{} {} {}", left.clone(), op, r) + } + CompareExpression::Size(op, r) => write!(f, "size {} {}", op, r), + CompareExpression::Timestamp(op, r) => write!( + f, + r#"timestamp {} "{}""#, + op, + r.to_rfc3339_opts(chrono::SecondsFormat::Millis, false) + ), + CompareExpression::TimestampBetween(l, r) => write!( + f, + r#"timestamp between "{}" and "{}"#, + l.to_rfc3339_opts(chrono::SecondsFormat::Millis, false), + r.to_rfc3339_opts(chrono::SecondsFormat::Millis, false) + ), + } + } +} + +#[cfg(feature = "native")] +pub fn parse_compare(input: &str) -> IResult<&str, CompareExpression> { + use crate::search::{ + compare::string::parse_string_operator, + number::parse_number, + string::parse_string, + symbol::{ + parse_header_symbol, parse_key, parse_offset, parse_partition, parse_size, + parse_timestamp_symbol, parse_topic, parse_value_symbol, Symbol, + }, + timestamp::parse_timestamp, + wsi::wsi, + }; + + use super::number::parse_number_operator; + + alt(( + map( + (parse_offset, wsi(parse_number_operator), wsi(parse_number)), + |(_, op, r)| CompareExpression::Offset(op, r), + ), + map( + (parse_size, wsi(parse_number_operator), wsi(parse_number)), + |(_, op, r)| CompareExpression::Size(op, r), + ), + map( + ( + value(Symbol::OffsetTail, wsi(tag("offsetTail"))), + wsi(tag("==")), + wsi(parse_number), + ), + |(_, _, r)| CompareExpression::OffsetTail(r), + ), + map( + ( + parse_partition, + wsi(parse_number_operator), + wsi(parse_number), + ), + |(_, op, r)| CompareExpression::Partition(op, r as i32), + ), + map( + (parse_topic, wsi(parse_string_operator), wsi(parse_string)), + |(_, op, r)| CompareExpression::Topic(op, r), + ), + map( + (parse_key, wsi(parse_string_operator), wsi(parse_string)), + |(_, op, r)| 
CompareExpression::Key(op, r), + ), + map( + ( + parse_value_symbol, + wsi(parse_string_operator), + wsi(parse_string), + ), + |(left, op, r)| CompareExpression::Value(left.1, op, r), + ), + map( + ( + parse_header_symbol, + wsi(parse_string_operator), + wsi(parse_string), + ), + |(left, op, r)| CompareExpression::Header(left.1, op, r), + ), + map( + ( + parse_timestamp_symbol, + wsi(parse_number_operator), + wsi(parse_timestamp), + ), + |(_, op, r)| CompareExpression::Timestamp(op, r), + ), + map( + ( + parse_timestamp_symbol, + wsi(tag_no_case("between")), + wsi(parse_timestamp), + wsi(tag_no_case("and")), + wsi(parse_timestamp), + ), + |(_, _, from, _, to)| CompareExpression::TimestampBetween(from, to), + ), + )) + .parse(input) +} diff --git a/crates/lib/src/search/compare/mod.rs b/crates/lib/src/search/compare/mod.rs new file mode 100644 index 0000000..4957828 --- /dev/null +++ b/crates/lib/src/search/compare/mod.rs @@ -0,0 +1,16 @@ +#[cfg(feature = "native")] +pub mod expression; +#[cfg(feature = "native")] +pub mod number; +pub mod string; + +#[cfg(feature = "native")] +pub use expression::parse_compare; +#[cfg(feature = "native")] +pub use expression::CompareExpression; +#[cfg(feature = "native")] +pub use number::NumberOperator; +pub use string::StringOperator; + +#[cfg(test)] +pub mod mod_test; diff --git a/crates/lib/src/search/compare/mod_test.rs b/crates/lib/src/search/compare/mod_test.rs new file mode 100644 index 0000000..36885e7 --- /dev/null +++ b/crates/lib/src/search/compare/mod_test.rs @@ -0,0 +1,18 @@ +use crate::{search::compare::parse_compare, search::parse_search_query}; + +#[test] +fn test_parse_compare() { + assert!(parse_compare(r#"timestamp between "2024-05-28T17:55:08.145+02:00" and now"#).is_ok()); + assert!(parse_search_query( + r#"timestamp between "2024-05-28T17:55:08.145+02:00" and now from begin"# + ) + .is_ok()); +} + +#[test] +fn test_parse_search_query() { + assert!(parse_search_query( + r#"timestamp between "2024-05-28T17:55:08.145+02:00" and now from begin"# + ) + .is_ok()); +} diff --git a/crates/lib/src/search/compare/number.rs b/crates/lib/src/search/compare/number.rs new file mode 100644 index 0000000..1ac85f3 --- /dev/null +++ b/crates/lib/src/search/compare/number.rs @@ -0,0 +1,40 @@ +use std::fmt::Display; + +use crate::search::wsi::wsi; +use nom::Parser; +use nom::{branch::alt, bytes::complete::tag, combinator::value, IResult}; + +#[derive(Debug, PartialEq, Clone, Eq)] +pub enum NumberOperator { + GreaterThan, + GreaterOrEqual, + LowerThan, + LowerOrEqual, + Equal, + NotEqual, +} + +impl Display for NumberOperator { + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + match self { + Self::GreaterThan => write!(f, ">"), + Self::GreaterOrEqual => write!(f, ">="), + Self::LowerThan => write!(f, "<"), + Self::LowerOrEqual => write!(f, "<="), + Self::Equal => write!(f, "=="), + Self::NotEqual => write!(f, "!="), + } + } +} + +pub fn parse_number_operator(input: &str) -> IResult<&str, NumberOperator> { + alt(( + value(NumberOperator::GreaterOrEqual, wsi(tag(">="))), + value(NumberOperator::LowerOrEqual, wsi(tag("<="))), + value(NumberOperator::GreaterThan, wsi(tag(">"))), + value(NumberOperator::LowerThan, wsi(tag("<"))), + value(NumberOperator::Equal, wsi(tag("=="))), + value(NumberOperator::NotEqual, wsi(tag("!="))), + )) + .parse(input) +} diff --git a/crates/lib/src/search/compare/string.rs b/crates/lib/src/search/compare/string.rs new file mode 100644 index 0000000..bf98070 --- /dev/null +++ 
b/crates/lib/src/search/compare/string.rs @@ -0,0 +1,55 @@ +use std::fmt::Display; + +#[cfg(feature = "native")] +use crate::search::wsi::wsi; +#[cfg(feature = "native")] +use nom::{ + branch::alt, bytes::complete::tag, bytes::complete::tag_no_case, combinator::value, + sequence::pair, IResult, Parser, +}; + +#[derive(Debug, PartialEq, Clone, Eq)] +pub enum StringOperator { + Contain, + Equal, + NotEqual, + StartWith, +} + +impl Display for StringOperator { + fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { + match self { + StringOperator::Contain => write!(f, "contains"), + StringOperator::Equal => write!(f, "=="), + StringOperator::NotEqual => write!(f, "!="), + StringOperator::StartWith => write!(f, "starts with"), + } + } +} + +#[cfg(feature = "native")] +pub fn parse_string_operator(input: &str) -> IResult<&str, StringOperator> { + alt(( + value(StringOperator::Contain, wsi(alt((tag("~="), tag("=~"))))), + value( + StringOperator::Contain, + wsi(alt(( + tag_no_case("contains"), + tag_no_case("c"), + tag_no_case("contain"), + tag_no_case("include"), + tag_no_case("includes"), + ))), + ), + value( + StringOperator::StartWith, + wsi(pair( + wsi(alt((tag_no_case("starts"), tag_no_case("start")))), + wsi(tag_no_case("with")), + )), + ), + value(StringOperator::Equal, wsi(tag("=="))), + value(StringOperator::NotEqual, wsi(tag("!="))), + )) + .parse(input) +} diff --git a/crates/lib/src/search/expression.rs b/crates/lib/src/search/expression.rs new file mode 100644 index 0000000..c4b84e9 --- /dev/null +++ b/crates/lib/src/search/expression.rs @@ -0,0 +1,122 @@ +/// Expressions represent booleans expression such as +/// ```js +/// offset == 23 and key == "my-key" +/// offset == 23 && key == "my-key" +/// +/// key starts with "1234-" or offset < 100 +/// key starts with "1234-" || offset < 100 +/// ``` +use std::fmt::Display; + +use nom::bytes::complete::tag_no_case; +use nom::Parser; +use nom::{ + branch::alt, bytes::complete::tag, combinator::map, multi::many0, sequence::preceded, IResult, +}; + +use super::term::{parse_term, Term}; +use super::wsi::wsi; + +// https://stackoverflow.com/questions/9509048/antlr-parser-for-and-or-logic-how-to-get-expressions-between-logic-operators +pub type Expression = OrExpression; +#[derive(Debug, PartialEq, Clone)] +pub enum AndExpression { + AndTerm(Term), + AndExpression(Vec), +} + +#[derive(Debug, PartialEq, Clone)] +pub enum OrExpression { + OrTerm(AndExpression), + OrExpression(Vec), +} + +impl Display for AndExpression { + fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { + match self { + AndExpression::AndTerm(t) => write!(f, "{}", t), + AndExpression::AndExpression(e) => write!( + f, + "{}", + e.iter() + .map(|a| a.to_string()) + .collect::>() + .join(" && ") + ), + } + } +} + +impl Display for OrExpression { + fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { + match self { + OrExpression::OrTerm(t) => write!(f, "{}", t), + OrExpression::OrExpression(t) => write!( + f, + "{}", + t.iter() + .map(|a| a.to_string()) + .collect::>() + .join(" || ") + ), + } + } +} + +impl OrExpression { + pub(crate) fn is_empty(&self) -> bool { + match self { + OrExpression::OrTerm(_) => false, + OrExpression::OrExpression(v) => v.is_empty(), + } + } +} + +/// Parses an or expression, operator is `||` or `or` +pub(crate) fn parse_or_expression(input: &str) -> IResult<&str, OrExpression> { + if input.trim().is_empty() { + return Ok(("", Expression::OrExpression(vec![]))); + } + map( + ( + 
wsi(parse_and_expression), + many0(preceded( + wsi(alt((tag("||"), tag("or")))), + wsi(parse_and_expression), + )), + ), + |(l, ee)| { + if ee.is_empty() { + OrExpression::OrTerm(l) + } else { + let mut ll = vec![l]; + ll.extend(ee); + OrExpression::OrExpression(ll) + } + }, + ) + .parse(input) +} + +/// Parses an or expression, operator is `&&` or `and` +pub(crate) fn parse_and_expression(input: &str) -> IResult<&str, AndExpression> { + map( + ( + wsi(parse_term), + many0(preceded( + wsi(alt((tag("&&"), tag_no_case("and")))), + wsi(parse_term), + )), + ), + |(l, ee)| { + if ee.is_empty() { + AndExpression::AndTerm(l) + } else { + let mut ll = vec![l]; + ll.extend(ee); + AndExpression::AndExpression(ll) + } + }, + ) + .parse(input) +} diff --git a/crates/lib/src/search/expression_test.rs b/crates/lib/src/search/expression_test.rs new file mode 100644 index 0000000..6b91c6a --- /dev/null +++ b/crates/lib/src/search/expression_test.rs @@ -0,0 +1,39 @@ +use crate::search::{ + atom::Atom, + compare::{CompareExpression, NumberOperator, StringOperator}, + expression::{parse_or_expression, AndExpression, Expression}, + term::Term, +}; + +#[test] +fn test_parse_term() { + assert_eq!( + parse_or_expression("!offset == 8"), + Ok(( + "", + Expression::OrTerm(AndExpression::AndTerm(Term::Not(Atom::Compare( + CompareExpression::Offset(NumberOperator::Equal, 8) + )))) + )) + ) +} + +#[test] +fn test_parse_and_expression() { + assert_eq!( + parse_or_expression("offset == 0 && topic == 'boite'"), + Ok(( + "", + Expression::OrTerm(AndExpression::AndExpression(vec!( + Term::Atom(Atom::Compare(CompareExpression::Offset( + NumberOperator::Equal, + 0 + ))), + Term::Atom(Atom::Compare(CompareExpression::Topic( + StringOperator::Equal, + "boite".to_string() + ))), + ))) + )) + ) +} diff --git a/crates/lib/src/search/filter.rs b/crates/lib/src/search/filter.rs new file mode 100644 index 0000000..459258e --- /dev/null +++ b/crates/lib/src/search/filter.rs @@ -0,0 +1,119 @@ +//! Filters allow developers to extend the search engine. +//! +//! Syntactically speaking, a filter looks like this a function call: +//! ```sql +//! from beginning offset > 50 && contains("rust") +//! ``` +//! +//! In the example, the filter `contains` take 1 string parameter. Filters support string and number parameters for now. +//! Let's define this `contains` filter. +//! +//! this filter is not Rust code, nor assembly code but a wasm module. The wasm module must have the following requirements: +//! - the name of the wasm file (`contains.wasm`) corresponds to the name of the filter. +//! - the wasm module must implement 2 functions: +//! - `fn matches(input: Input): bool` - this function returns `true` if the kafka record matches the condition. +//! - `fn parse_parameters(params: Vec): bool` - this function is optional: it returns `true` when the parameters are valid. Parameters are serialized to an JSON array. +//! +//! The library uses [Extism](https://extism.org/) to develop wasm modules. +//! You can also find the source code of the `contains` WebAssembly module written in different supported programming languages. 
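Before the implementation, a short sketch of how a filter call written in a search query maps onto the `Filter` struct defined in this file. The filter name and arguments are made up; only `parse_filter` and `Parameter` from this module are assumed.

```rust
// Sketch: a filter call in a query string becomes a `Filter { name, parameters }`.
use crate::search::filter::{parse_filter, Parameter};

#[test]
fn filter_call_maps_to_filter_struct() {
    let (rest, filter) = parse_filter(r#"my-filter("rust", 42)"#).unwrap();
    assert_eq!(rest, "");
    assert_eq!(filter.name, "my-filter");
    assert_eq!(
        filter.parameters,
        vec![Parameter::String("rust".to_string()), Parameter::Number(42)]
    );
}
```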
+ +use itertools::Itertools; +use nom::{ + branch::alt, + bytes::complete::tag, + character::complete::{alphanumeric1, one_of}, + combinator::{map, recognize}, + multi::{many1, separated_list0}, + sequence::delimited, + IResult, Parser, +}; +use serde::{Deserialize, Serialize}; +use serde_json::{Number, Value}; +use std::fmt::Display; + +use crate::KafkaRecord; + +use super::{number::parse_number, string::parse_string, wsi::wsi}; + +#[derive(Debug, PartialEq, Clone, Default)] +pub struct Filter { + pub name: String, + pub parameters: Vec, +} + +impl Display for Filter { + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + write!( + f, + "{}({})", + self.name, + self.parameters.iter().map(|e| e.to_string()).join(", ") + ) + } +} + +#[derive(Debug, PartialEq, Clone, Eq)] +pub enum Parameter { + Number(i64), + String(String), +} + +impl Display for Parameter { + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + match self { + Parameter::Number(i) => write!(f, "{}", i), + Parameter::String(i) => write!(f, "'{}'", i), + } + } +} + +impl Parameter { + pub fn json(&self) -> Value { + match self { + Parameter::Number(i) => Value::Number(Number::from(*i)), + Parameter::String(i) => Value::String(i.to_string()), + } + } +} + +fn parse_filter_name(input: &str) -> IResult<&str, String> { + map( + recognize(wsi(many1(alt((alphanumeric1, recognize(one_of("_-"))))))), + |d: &str| d.to_string(), + ) + .parse(input) +} + +fn parse_parameter(input: &str) -> IResult<&str, Parameter> { + wsi(alt(( + map(parse_number, Parameter::Number), + map(parse_string, Parameter::String), + ))) + .parse(input) +} + +pub(crate) fn parse_filter(input: &str) -> IResult<&str, Filter> { + let (remaining, (name, params)) = ( + parse_filter_name, + delimited( + wsi(tag("(")), + separated_list0(wsi(tag(",")), parse_parameter), + wsi(tag(")")), + ), + ) + .parse(input)?; + Ok(( + remaining, + Filter { + name, + parameters: params, + }, + )) +} + +#[derive(Clone, Debug, Deserialize, Serialize, PartialEq, Eq)] +#[serde(rename_all = "lowercase")] +pub struct FilterInput { + pub record: KafkaRecord, + pub params: Vec, +} diff --git a/crates/lib/src/search/mod.rs b/crates/lib/src/search/mod.rs new file mode 100644 index 0000000..a43c116 --- /dev/null +++ b/crates/lib/src/search/mod.rs @@ -0,0 +1,95 @@ +//! This module defines the the parsing functions of search query. +//! The grammar of the syntax is the following: +//! +//! ```bnf +//! search-query ::= clause+ +//! clause ::= or-expression | limit-clause | from-clause | order-clause +//! or-expression ::= And-expression | and-expression 'or' and-expression +//! and-expression ::= atom | atom 'and' atom +//! term ::= atom | '!' atom +//! atom ::= comparison | filter | '(' expression ')' +//! number-symbol ::= 'offset' | 'partition' | 'size' +//! string-symbol ::= 'topic' | 'key' | 'timestamp' | 'value' +//! symbol ::= number-symbol | string-symbol +//! comparison ::= number-comparison | string-comparison | time-comparison +//! number-comparison ::= number-symbol number-operator number +//! string-comparison ::= string-symbol string-operator string +//! time-comparison ::= 'between' string 'and' string +//! number-operator ::= '==' | '!=' | '>' | '<' | '>=' | '<=' +//! string-operator ::= 'starts with' | '==' | '!=' | '=~' | 'contains' | 'contain' | 'includes' | 'include' +//! filter ::= .+ '('filter-parameters')' +//! filter-parameter ::= string | number +//! filter-parameters ::= filter-parameter (',' filter-parameter)* +//! 
limit-clause ::= 'limit' number +//! order-clause ::= 'order by' symbol order-keyword +//! order-keyword ::= 'asc' | 'desc' +//! from-clause ::= 'from' offset +//! offset ::= 'beginning' | 'begin' | 'end' | 'end' '-' number | string | number +//! number ::= [0-9_]+ +//! string ::= '"' [^"]+ '"' | "'" [^']+ "'" +//! ``` +//! You can use to visualize it. + +#[cfg(feature = "native")] +pub mod atom; +#[cfg(feature = "native")] +pub mod clause; +#[cfg(feature = "native")] +pub mod expression; +#[cfg(feature = "native")] +pub mod filter; +#[cfg(feature = "native")] +pub mod number; +#[cfg(feature = "native")] +pub mod offset; +#[cfg(feature = "native")] +pub mod order; +#[cfg(feature = "native")] +pub mod search_query; +#[cfg(feature = "native")] +pub mod string; +#[cfg(feature = "native")] +pub mod symbol; +#[cfg(feature = "native")] +pub mod term; +#[cfg(feature = "native")] +pub mod timestamp; +#[cfg(feature = "native")] +pub mod wsi; + +pub mod compare; + +#[cfg(feature = "native")] +pub use order::Order; +#[cfg(feature = "native")] +pub use order::OrderBy; +#[cfg(feature = "native")] +pub use search_query::parse_search_query; +#[cfg(feature = "native")] +pub use search_query::SearchQuery; +use serde::Deserialize; +use serde::Serialize; + +#[cfg(test)] +pub mod expression_test; +#[cfg(test)] +pub mod number_test; +#[cfg(test)] +pub mod offset_test; + +#[derive(Debug, PartialEq, Clone, Default, Deserialize, Serialize)] +pub struct FilterResult { + pub r#match: bool, +} + +impl FilterResult { + pub fn new(r#match: bool) -> Self { + Self { r#match } + } +} + +impl From for FilterResult { + fn from(r#match: bool) -> Self { + Self { r#match } + } +} diff --git a/crates/lib/src/search/number.rs b/crates/lib/src/search/number.rs new file mode 100644 index 0000000..312e6a6 --- /dev/null +++ b/crates/lib/src/search/number.rs @@ -0,0 +1,45 @@ +use nom::character::complete::char; +use nom::multi::many0; +use nom::Parser; +use nom::{ + character::complete::digit1, + combinator::{map_res, opt, recognize}, + sequence::pair, + IResult, +}; + +/// Parses an unsigned number. +pub(crate) fn parse_unsigned_number(input: &str) -> IResult<&str, usize> { + map_res(parse_unsigned_number_as_string, |d: &str| { + d.replace('_', "").parse() + }) + .parse(input) +} + +/// Parses an unsigned number. +/// The number can contains '_' for readability purposes. +pub(crate) fn parse_unsigned_number_as_string(input: &str) -> IResult<&str, &str> { + recognize(pair(digit1, many0((char('_'), digit1)))).parse(input) +} + +/// Parses a signed number. 
+pub(crate) fn parse_number(input: &str) -> IResult<&str, i64> { + map_res( + pair(opt(char('-')), parse_unsigned_number_as_string), + |(o, n)| { + let n = n.replace('_', "").parse::(); + if o.is_some() { + return n.map(|e| -e); + } + n + }, + ) + .parse(input) +} + +#[test] +fn test_parse_number() { + assert_eq!(parse_number("10"), Ok(("", 10))); + assert_eq!(parse_number("10_0"), Ok(("", 100))); + assert_eq!(parse_number("-10_0"), Ok(("", -100))); +} diff --git a/crates/lib/src/search/number_test.rs b/crates/lib/src/search/number_test.rs new file mode 100644 index 0000000..4b0e0e3 --- /dev/null +++ b/crates/lib/src/search/number_test.rs @@ -0,0 +1,18 @@ +use crate::search::number::{parse_number, parse_unsigned_number_as_string}; + +#[test] +fn test_parse_unsigned_number() { + assert!(parse_number("2343").is_ok()); +} + +#[test] +fn test_parse_unsigned_number_as_string() { + assert!(parse_unsigned_number_as_string("2_343").is_ok()); +} + +#[test] +fn test_parse_number() { + assert_eq!(parse_number("10"), Ok(("", 10))); + assert_eq!(parse_number("10_0"), Ok(("", 100))); + assert_eq!(parse_number("-10_0"), Ok(("", -100))); +} diff --git a/crates/lib/src/search/offset.rs b/crates/lib/src/search/offset.rs new file mode 100644 index 0000000..dfce4f8 --- /dev/null +++ b/crates/lib/src/search/offset.rs @@ -0,0 +1,77 @@ +use std::fmt::Display; + +use nom::{ + branch::alt, + bytes::{complete::tag, tag_no_case}, + combinator::{map, value}, + sequence::preceded, + IResult, Parser, +}; + +use super::{ + number::parse_number, + symbol::{parse_end_keyword, parse_offset}, + timestamp::parse_timestamp, + wsi::wsi, +}; + +/// A kafka offset. +#[derive(Debug, Clone, PartialEq, Eq)] +pub enum FromOffset { + /// Start consuming from the beginning of the partition. + Beginning, + /// Start consuming from the end of the partition. + End, + /// A specific offset to consume from. + Offset(i64), + /// An offset relative to the end of the partition. + OffsetTail(i64), + /// Start consuming from a specific timestamp end of the partition. + Timestamp(i64), +} + +impl Display for FromOffset { + fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { + match self { + FromOffset::Beginning => write!(f, "beginning"), + FromOffset::End => write!(f, "end"), + FromOffset::Offset(o) => write!(f, "{}", o), + FromOffset::OffsetTail(o) => write!(f, "end - {}", o), + FromOffset::Timestamp(_) => write!(f, ""), + } + } +} + +/// parses the clause defining from where the consumer should starting reading records. 
+/// ```text +/// from begin +/// from end +/// from "3 hours ago" +/// from 34895 +/// from -10 +/// ``` +pub(crate) fn parse_from_offset(input: &str) -> IResult<&str, FromOffset> { + preceded( + wsi(tag_no_case("from")), + alt(( + map(wsi(parse_timestamp), |t| { + FromOffset::Timestamp(t.to_utc().timestamp_millis()) + }), + value( + FromOffset::Beginning, + wsi(alt((tag("beginning"), tag_no_case("begin")))), + ), + map( + (parse_offset, wsi(tag("==")), wsi(parse_number)), + |(_, _, d)| FromOffset::Offset(d), + ), + map(wsi(parse_number), FromOffset::Offset), + map( + (parse_end_keyword, wsi(tag("-")), wsi(parse_number)), + |(_, _, r)| FromOffset::OffsetTail(r), + ), + value(FromOffset::End, parse_end_keyword), + )), + ) + .parse(input) +} diff --git a/crates/lib/src/search/offset_test.rs b/crates/lib/src/search/offset_test.rs new file mode 100644 index 0000000..242e060 --- /dev/null +++ b/crates/lib/src/search/offset_test.rs @@ -0,0 +1,11 @@ +use crate::search::offset::parse_from_offset; + +#[test] +fn test_parse_from_offset() { + assert!(parse_from_offset(r#"from "2024-05-28T17:55:08.145+02:00""#).is_ok()); +} + +#[test] +fn test_parse_from_end_minus_number() { + assert!(parse_from_offset(r#"from end - 10"#).is_ok()); +} diff --git a/crates/lib/src/search/order.rs b/crates/lib/src/search/order.rs new file mode 100644 index 0000000..8ae1d93 --- /dev/null +++ b/crates/lib/src/search/order.rs @@ -0,0 +1,138 @@ +use nom::Parser; +use nom::{branch::alt, bytes::complete::tag, combinator::value, IResult}; + +use super::symbol::{ + parse_key, parse_offset, parse_partition, parse_size, parse_timestamp_symbol, parse_topic, + parse_value, Symbol, +}; +use super::wsi::wsi; + +/// This struct is only used when you start the TUI. +/// You can order kafka records in the terminal as you could do with SQL. +/// +/// ```sql +/// order by key desc +/// sort by partition asc +/// ``` +#[derive(Debug, Clone, PartialEq, Eq, Default)] +pub struct OrderBy { + pub order: Order, + pub keyword: OrderKeyword, +} + +#[derive(Debug, Clone, PartialEq, Eq)] +pub enum OrderKeyword { + Desc, + Asc, +} + +impl Default for Order { + fn default() -> Self { + Self::Timestamp + } +} + +impl Default for OrderKeyword { + fn default() -> Self { + Self::Asc + } +} + +/// You can order kafka records by the following fields. 
+#[derive(Debug, Clone, PartialEq, Eq)] +pub enum Order { + Timestamp, + Key, + Value, + Partition, + Offset, + Size, + Topic, +} + +impl OrderBy { + pub fn new(order: Order, keyword: OrderKeyword) -> Self { + Self { order, keyword } + } + + pub fn is_descending(&self) -> bool { + self.keyword == OrderKeyword::Desc + } +} + +pub(crate) fn parse_order(input: &str) -> IResult<&str, Order> { + let t = wsi(alt(( + parse_size, + parse_timestamp_symbol, + parse_offset, + parse_key, + parse_value, + parse_topic, + parse_partition, + ))) + .parse(input)?; + + let o = match t.1 { + Symbol::Offset => Order::Offset, + Symbol::Key => Order::Key, + Symbol::Topic => Order::Topic, + Symbol::Value(_) => Order::Value, + Symbol::Partition => Order::Partition, + Symbol::OffsetTail => unreachable!("nope"), + Symbol::Size => Order::Size, + Symbol::Timestamp => Order::Timestamp, + Symbol::Header(_) => unreachable!("nope"), + }; + Ok((t.0, o)) +} + +pub(crate) fn parse_order_keyword(input: &str) -> IResult<&str, OrderKeyword> { + alt(( + value(OrderKeyword::Asc, wsi(tag("asc"))), + value(OrderKeyword::Desc, wsi(tag("desc"))), + )) + .parse(input) +} + +impl std::fmt::Display for OrderBy { + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + write!(f, "order by {} {}", self.order, self.keyword) + } +} + +impl std::fmt::Display for Order { + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + let order = match self { + Order::Timestamp => "timestamp", + Order::Key => "key", + Order::Value => "value", + Order::Partition => "partition", + Order::Offset => "offset", + Order::Size => "size", + Order::Topic => "topic", + }; + write!(f, "{}", order) + } +} + +impl std::fmt::Display for OrderKeyword { + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + let keyword = match self { + OrderKeyword::Desc => "desc", + OrderKeyword::Asc => "asc", + }; + write!(f, "{}", keyword) + } +} + +#[test] +fn test_parse_order() { + assert_eq!(parse_order(r#"partition"#), Ok(("", Order::Partition))); + assert!(parse_order(r#"!value"#).is_err()); +} + +#[test] +fn test_parse_order_keyword() { + assert_eq!(parse_order_keyword(r#"asc"#), Ok(("", OrderKeyword::Asc))); + assert_eq!(parse_order_keyword(r#"desc"#), Ok(("", OrderKeyword::Desc))); +} diff --git a/crates/lib/src/search/search_query.rs b/crates/lib/src/search/search_query.rs new file mode 100644 index 0000000..8892fe4 --- /dev/null +++ b/crates/lib/src/search/search_query.rs @@ -0,0 +1,110 @@ +use itertools::Itertools; +use nom::{ + branch::alt, + combinator::{eof, map}, + multi::many_till, + Parser, +}; + +use crate::error::SearchError; + +use super::{ + clause::{ + parse_expression, parse_from_offset_clause, parse_limit, parse_order_by, SearchClause, + }, + expression::Expression, + offset::FromOffset, + order::{Order, OrderBy, OrderKeyword}, + wsi::wsi, +}; + +/// A `SearchQuery` is a combination of a expression, a limit, an offset and an order by clause. 
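A rough end-to-end sketch of that combination, assuming only the items shown in this diff (the query text is illustrative and not taken from the `tests/inputs/` files):

```rust
// Sketch: one query filling the expression, `from`, `order by` and `limit` clauses.
use crate::search::{offset::FromOffset, search_query::parse_search_query};

#[test]
fn a_query_can_set_all_four_clauses() {
    let (rest, query) = parse_search_query(
        r#"from beginning where value contains "rust" order by timestamp desc limit 500"#,
    )
    .unwrap();
    assert_eq!(rest, "");
    assert_eq!(query.from, Some(FromOffset::Beginning));
    assert_eq!(query.limit, Some(500));
    assert!(query.order_by.is_descending());
}
```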
+#[derive(Debug, Clone, PartialEq)] +pub struct SearchQuery { + pub expression: Expression, + pub limit: Option, + pub from: Option, + pub order_by: OrderBy, + //pub group_by_key: bool, +} + +impl SearchQuery { + pub fn is_empty(&self) -> bool { + self.limit.is_none() && self.from.is_none() && self.expression.is_empty() + } +} + +impl std::fmt::Display for SearchQuery { + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + let mut clauses = vec![]; + + let from = match &self.from { + Some(f) => format!("from {}", f), + None => "".to_string(), + }; + let limit = match self.limit { + Some(i) => format!("limit {}", i), + None => "".to_string(), + }; + clauses.push(from.to_string()); + clauses.push(format!("{}", self.expression)); + clauses.push(format!("{}", self.order_by)); + clauses.push(limit.to_string()); + let clauses = clauses.into_iter().filter(|e| !e.is_empty()).collect_vec(); + write!(f, "{}", clauses.join(" ")) + } +} + +impl Default for SearchQuery { + fn default() -> Self { + Self { + expression: Expression::OrExpression(vec![]), + limit: None, + from: None, + order_by: OrderBy::new(Order::Timestamp, OrderKeyword::Asc), + //group_by_key: false, + } + } +} + +pub fn parse_search_query(input: &str) -> Result<(&str, SearchQuery), SearchError> { + map( + many_till( + alt(( + parse_from_offset_clause, + parse_limit, + parse_expression, + parse_order_by, + )), + wsi(eof), + ), + |clauses| { + let mut s = SearchQuery::default(); + for c in clauses.0 { + match c { + SearchClause::Limit(i) => s.limit = Some(i), + SearchClause::From(f) => s.from = Some(f), + SearchClause::Expression(u) => s.expression = u, + SearchClause::OrderBy(order, k) => { + s.order_by = OrderBy::new(order, k.unwrap_or(OrderKeyword::Asc)) + } //SearchClause::GroupByKey => s.group_by_key = true, + } + } + s + }, + ) + .parse(input) + .map_err(|e| { + let remaining = match e { + nom::Err::Incomplete(_) => input.to_string(), + nom::Err::Error(s) => s.input.to_string(), + nom::Err::Failure(s) => s.input.to_string(), + }; + SearchError::Parse(remaining) + }) +} + +#[test] +fn test_parse_search_query() { + assert!(parse_search_query(r#" from end - 10"#).is_ok()); +} diff --git a/crates/lib/src/search/string.rs b/crates/lib/src/search/string.rs new file mode 100644 index 0000000..b9d875c --- /dev/null +++ b/crates/lib/src/search/string.rs @@ -0,0 +1,29 @@ +use nom::{ + branch::alt, + bytes::complete::{tag, take_until}, + combinator::map, + sequence::delimited, + IResult, Parser, +}; + +/// A string is delimited by single or double quotes. +/// ```text +/// "this is a string" +/// 'this a another string' +/// ``` +pub(crate) fn parse_string(input: &str) -> IResult<&str, String> { + map( + alt(( + delimited(tag("\""), take_until("\""), tag("\"")), + delimited(tag("'"), take_until("'"), tag("'")), + )), + |d: &str| d.to_string(), + ) + .parse(input) +} + +#[test] +fn test_parse_string() { + assert_eq!(parse_string(r#"'halo'"#), Ok(("", "halo".to_string()))); + assert_eq!(parse_string(r#""hola""#), Ok(("", "hola".to_string()))); +} diff --git a/crates/lib/src/search/symbol.rs b/crates/lib/src/search/symbol.rs new file mode 100644 index 0000000..91276f7 --- /dev/null +++ b/crates/lib/src/search/symbol.rs @@ -0,0 +1,104 @@ +//! Symbols built-in variables representing each attribute of a kafka record +//! Symbols have also aliases: 't' for 'topic, 'o' for 'offset'... 
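A minimal sketch of the alias behaviour described above, using only the per-symbol parsers defined in this file, since the combined `parse_symbol` is commented out in the code that follows:

```rust
// Sketch: a symbol name and its short alias resolve to the same `Symbol` variant.
use crate::search::symbol::{parse_offset, parse_topic, Symbol};

#[test]
fn long_names_and_aliases_parse_to_the_same_symbol() {
    assert_eq!(parse_topic("topic"), Ok(("", Symbol::Topic)));
    assert_eq!(parse_topic("t"), Ok(("", Symbol::Topic)));
    assert_eq!(parse_offset("offset"), Ok(("", Symbol::Offset)));
    assert_eq!(parse_offset("o"), Ok(("", Symbol::Offset)));
}
```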
+use nom::{ + branch::alt, + bytes::complete::{tag, tag_no_case, take_while}, + combinator::{map, opt, recognize, value}, + error::ErrorKind, + sequence::preceded, + IResult, Parser, +}; +use strum::Display; + +use super::wsi::wsi; + +#[derive(Debug, Display, PartialEq, Eq, Clone)] +pub enum Symbol { + Offset, + Topic, + Partition, + OffsetTail, + Key, + Size, + Timestamp, + Value(Option), + Header(String), +} + +// pub(crate) fn parse_symbol(input: &str) -> IResult<&str, Symbol> { +// alt(( +// parse_offset, +// parse_timestamp_symbol, +// parse_topic, +// parse_partition, +// parse_key, +// parse_size, +// parse_value, +// map(parse_value_symbol, |e| e.0), +// map(parse_header_symbol, |e| e.0), +// )) +// .parse(input) +// } + +pub(crate) fn parse_offset(input: &str) -> IResult<&str, Symbol> { + value(Symbol::Offset, wsi(alt((tag("offset"), tag("o"))))).parse(input) +} + +pub(crate) fn parse_size(input: &str) -> IResult<&str, Symbol> { + value(Symbol::Size, wsi(alt((tag("size"), tag("si"))))).parse(input) +} + +pub(crate) fn parse_partition(input: &str) -> IResult<&str, Symbol> { + value(Symbol::Partition, wsi(alt((tag("partition"), tag("p"))))).parse(input) +} + +pub(crate) fn parse_value(input: &str) -> IResult<&str, Symbol> { + value(Symbol::Value(None), wsi(alt((tag("value"), tag("v"))))).parse(input) +} + +pub(crate) fn parse_topic(input: &str) -> IResult<&str, Symbol> { + value(Symbol::Topic, wsi(alt((tag("topic"), tag("t"))))).parse(input) +} + +pub(crate) fn parse_key(input: &str) -> IResult<&str, Symbol> { + value(Symbol::Key, wsi(alt((tag("key"), tag("k"))))).parse(input) +} + +pub(crate) fn parse_timestamp_symbol(input: &str) -> IResult<&str, Symbol> { + value(Symbol::Timestamp, wsi(alt((tag("timestamp"), tag("ts"))))).parse(input) +} + +pub(crate) fn parse_value_symbol(input: &str) -> IResult<&str, (Symbol, Option)> { + map( + preceded(wsi(alt((tag("value"), tag("v")))), opt(parse_json_path)), + |json_path| (Symbol::Value(json_path.clone()), json_path), + ) + .parse(input) +} + +pub(crate) fn parse_header_symbol(input: &str) -> IResult<&str, (Symbol, String)> { + map( + preceded(alt((wsi(tag("headers")), wsi(tag("h")))), parse_json_path), + |json_path| { + let t = json_path.replace('.', ""); + (Symbol::Header(t.clone()), t) + }, + ) + .parse(input) +} + +/// Parse a JSON Pointer, producing a list of decoded segments. 
+pub(crate) fn parse_json_path(input: &str) -> IResult<&str, String> { + let (remaining, json_path) = recognize(take_while(|ch| ch != ' ')).parse(input)?; + match json_path.is_empty() { + true => Err(nom::Err::Error(nom::error::Error::new( + remaining, + ErrorKind::Fail, + ))), + false => Ok((remaining, json_path.to_string())), + } +} + +pub(crate) fn parse_end_keyword(input: &str) -> IResult<&str, ()> { + map(wsi(alt((tag_no_case("end"), tag_no_case("now")))), |_| ()).parse(input) +} diff --git a/crates/lib/src/search/symbol_test.rs b/crates/lib/src/search/symbol_test.rs new file mode 100644 index 0000000..972e3e8 --- /dev/null +++ b/crates/lib/src/search/symbol_test.rs @@ -0,0 +1,43 @@ +use crate::search::symbol::{parse_symbol, Symbol}; + +#[test] +fn test_parse_value() { + assert_eq!(parse_symbol(r#"value"#), Ok(("", Symbol::Value(None)))); + assert_eq!(parse_symbol(r#"v"#), Ok(("", Symbol::Value(None)))); +} + +#[test] +fn test_parse_topic() { + assert_eq!(parse_symbol(r#"topic"#), Ok(("", Symbol::Topic))); + assert_eq!(parse_symbol(r#"t"#), Ok(("", Symbol::Topic))); +} + +#[test] +fn test_parse_key() { + assert_eq!(parse_symbol(r#"key"#), Ok(("", Symbol::Key))); + assert_eq!(parse_symbol(r#"k"#), Ok(("", Symbol::Key))); +} + +#[test] +fn test_parse_partition() { + assert_eq!(parse_symbol(r#"partition"#), Ok(("", Symbol::Partition))); + assert_eq!(parse_symbol(r#"p"#), Ok(("", Symbol::Partition))); +} + +#[test] +fn test_parse_offset() { + assert_eq!(parse_symbol(r#"offset"#), Ok(("", Symbol::Offset))); + assert_eq!(parse_symbol(r#"o"#), Ok(("", Symbol::Offset))); +} + +#[test] +fn test_parse_timestamp() { + assert_eq!(parse_symbol(r#"timestamp"#), Ok(("", Symbol::Timestamp))); + assert_eq!(parse_symbol(r#"ts"#), Ok(("", Symbol::Timestamp))); +} + +#[test] +fn test_parse_size() { + assert_eq!(parse_symbol(r#"size"#), Ok(("", Symbol::Size))); + assert_eq!(parse_symbol(r#"si"#), Ok(("", Symbol::Size))); +} diff --git a/crates/lib/src/search/term.rs b/crates/lib/src/search/term.rs new file mode 100644 index 0000000..ff2618e --- /dev/null +++ b/crates/lib/src/search/term.rs @@ -0,0 +1,53 @@ +use std::fmt::Display; + +use nom::{ + branch::alt, bytes::complete::tag, combinator::map, sequence::preceded, IResult, Parser, +}; + +use super::{ + atom::{parse_atom, Atom}, + wsi::wsi, +}; + +/// A term is either: +/// - An atom, +/// - Or a negative atom. 
+///```sql +/// !(offset > 50) +/// offset > 50 +/// ``` +#[derive(Debug, PartialEq, Clone)] +pub enum Term { + Not(Atom), + Atom(Atom), +} + +pub(crate) fn parse_term(input: &str) -> IResult<&str, Term> { + alt(( + map(preceded(wsi(tag("!")), parse_atom), Term::Not), + map(parse_atom, Term::Atom), + )) + .parse(input) +} + +impl Display for Term { + fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { + match self { + Term::Not(a) => write!(f, "!{}", a), + Term::Atom(a) => write!(f, "{}", a), + } + } +} + +//#[test] +//fn test_parse() { +// use crate::search::symbol::Symbol; +// assert_eq!( +// parse_term(r#"!partition"#), +// Ok(("", Term::Not(Atom::Symbol(Symbol::Partition)))) +// ); +// assert_eq!( +// parse_term(r#"topic"#), +// Ok(("", Term::Atom(Atom::Symbol(Symbol::Topic)))) +// ); +//} diff --git a/crates/lib/src/search/timestamp.rs b/crates/lib/src/search/timestamp.rs new file mode 100644 index 0000000..6bebe08 --- /dev/null +++ b/crates/lib/src/search/timestamp.rs @@ -0,0 +1,39 @@ +use chrono::{DateTime, Local}; +use fuzzydate::parse; +use nom::{ + branch::alt, + bytes::complete::tag_no_case, + combinator::{map_res, value}, + IResult, Parser, +}; + +use super::{string::parse_string, wsi::wsi}; + +/// Parses a timestamp. +/// It can be a RFC3339 date time +/// or a fuzzy date (`3 hours ago`) or the 'now' keyword. +/// +/// ```text +/// "3 hours ago" +/// '2024-09-19T17:59:25.815+02:00' +/// now +/// ``` +pub(crate) fn parse_timestamp(input: &str) -> IResult<&str, DateTime> { + alt(( + map_res(parse_string, |s| { + DateTime::parse_from_rfc3339(&s).map(|d| d.with_timezone(&Local)) + }), + map_res(parse_string, |s| { + parse(s).map(|d| d.and_local_timezone(Local).unwrap()) + }), + value(Local::now(), wsi(tag_no_case("now"))), + )) + .parse(input) +} + +#[test] +fn test_parse_timestamp() { + assert!(parse_timestamp(r#"'3 hours ago'"#).is_ok()); + assert!(parse_timestamp(r#"now"#).is_ok()); + assert!(parse_timestamp(r#""2024-09-17T06:44:59Z""#).is_ok()); +} diff --git a/crates/lib/src/search/wsi.rs b/crates/lib/src/search/wsi.rs new file mode 100644 index 0000000..6617dda --- /dev/null +++ b/crates/lib/src/search/wsi.rs @@ -0,0 +1,27 @@ +use nom::{ + branch::alt, + bytes::complete::tag, + character::complete::{line_ending, space1}, + combinator::value, + error::ParseError, + multi::many0, + sequence::preceded, + AsChar, Compare, Input, Parser, +}; + +/// Gets rid of spaces, tabs and backslash + newline. 
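A small sketch of what that means in practice: wrapping an inner parser with `wsi` makes it tolerant of leading spaces, tabs, and backslash-escaped line breaks. The `limit` tag is just an arbitrary example, not tied to any particular call site.

```rust
// Sketch: `wsi` strips leading whitespace and "\<newline>" before running the inner parser.
use nom::{bytes::complete::tag, IResult, Parser};

use crate::search::wsi::wsi;

fn ws_limit(input: &str) -> IResult<&str, &str> {
    wsi(tag("limit")).parse(input)
}

#[test]
fn wsi_skips_leading_whitespace_and_escaped_newlines() {
    assert_eq!(ws_limit("   limit 10"), Ok((" 10", "limit")));
    assert_eq!(ws_limit("\tlimit 10"), Ok((" 10", "limit")));
    assert_eq!(ws_limit("\\\nlimit 10"), Ok((" 10", "limit")));
}
```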
+pub(crate) fn wsi>(inner: F) -> impl Parser +where + I: Clone + Input, + I: Compare<&'static str>, + ::Item: AsChar, + F: Parser, +{ + preceded( + value( + (), + many0(alt((preceded(tag("\\"), line_ending), line_ending, space1))), + ), + inner, + ) +} diff --git a/crates/lib/tests/inputs/1.sql b/crates/lib/tests/inputs/1.sql new file mode 100644 index 0000000..5829121 --- /dev/null +++ b/crates/lib/tests/inputs/1.sql @@ -0,0 +1,5 @@ + from beginning +where my-wasm-filter("cool", "cat") + or (timestamp between "2024-05-28T17:55:08.145+02:00" and now and value contains "foundation") +order by key desc +limit 1_000 \ No newline at end of file diff --git a/crates/lib/tests/inputs/2.sql b/crates/lib/tests/inputs/2.sql new file mode 100644 index 0000000..6cfcd12 --- /dev/null +++ b/crates/lib/tests/inputs/2.sql @@ -0,0 +1,7 @@ + from end - 5_000 +where value contains "uv" + and k contains "foobar" + or t == "french-recipes" + and !(partition != 1) +order by timestamp asc +limit 100 \ No newline at end of file diff --git a/crates/lib/tests/inputs/3.sql b/crates/lib/tests/inputs/3.sql new file mode 100644 index 0000000..d0847f1 --- /dev/null +++ b/crates/lib/tests/inputs/3.sql @@ -0,0 +1,7 @@ +where ((topic == "system" and key contains "restart") + or !(value starts with "ignored" and partition > 2)) + and (timestamp between "3 hours ago" and "20 minutes ago") + and myFilter("check", "error", 500) or myFilter("type", 100) +order by key desc +limit 50 +from beginning \ No newline at end of file diff --git a/crates/lib/tests/inputs/record-1.json b/crates/lib/tests/inputs/record-1.json new file mode 100644 index 0000000..eb39652 --- /dev/null +++ b/crates/lib/tests/inputs/record-1.json @@ -0,0 +1,44 @@ +{ + "value": { + "type": "Feature", + "geometry": { + "type": "Point", + "coordinates": [ + 4.0215, + 49.211613 + ] + }, + "properties": { + "label": "Rue Franz Kafka 51100 Reims", + "score": 0.6992354545454544, + "id": "51454_3782", + "name": "Rue Franz Kafka", + "postcode": "51100", + "citycode": "51454", + "x": 774438.44, + "y": 6901804.03, + "city": "Reims", + "context": "51, Marne, Grand Est", + "type": "street", + "importance": 0.69159, + "street": "Rue Franz Kafka" + } + }, + "key": "381", + "topic": "public-french-addresses", + "timestamp": 1732481153241, + "partition": 0, + "offset": 4, + "headers": {}, + "key_schema": { + "id": 1 + }, + "value_schema": { + "id": 2 + }, + "size": 381, + "date_time": "2024-11-24T21:45:53.241+01:00", + "absolute_delta_in_ms": 0, + "relative_delta_in_ms": 0, + "search_query": "" + } \ No newline at end of file diff --git a/crates/lib/tests/mod.rs b/crates/lib/tests/mod.rs new file mode 100644 index 0000000..e5febbc --- /dev/null +++ b/crates/lib/tests/mod.rs @@ -0,0 +1,33 @@ +use insta::{assert_debug_snapshot, glob}; +use std::fs; +use yozefu_lib::{parse_search_query, ExportedKafkaRecord, KafkaRecord}; + +#[test] +fn test_inputs() { + glob!("inputs/*.sql", |path| { + unsafe { + use std::env; + // Set the timezone to Paris to have a fixed timezone for the tests + env::set_var("TZ", "Europe/Paris"); + } + + let input = fs::read_to_string(path).unwrap(); + let input = input.trim(); + insta::with_settings!({ + description => input.replace("\n", " "), + filters => vec![ + ("[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}:[0-9]{2}:[0-9]{2}\\.[0-9]{6,}\\+[0-9]{2}:[0-9]{2}", "[datetime]"), + ]}, { + assert_debug_snapshot!(parse_search_query(input)); + }); + }); +} + +#[test] +fn test_exported_record() { + glob!("inputs/record*.json", |path| { + let input = 
fs::read_to_string(path).unwrap(); + let record: KafkaRecord = serde_json::from_str(&input).unwrap(); + assert_debug_snapshot!(ExportedKafkaRecord::from(&record)); + }); +} diff --git a/crates/lib/tests/protobuf.rs b/crates/lib/tests/protobuf.rs new file mode 100644 index 0000000..1c540d2 --- /dev/null +++ b/crates/lib/tests/protobuf.rs @@ -0,0 +1,18 @@ +//! Work in progress + +// +// ```bash +// bash docs/try-it.sh +// docker compose exec -T schema-registry bash +// kafka-protobuf-console-producer --bootstrap-server kafka:19092 \ +// --property schema.registry.url=http://localhost:8082 --topic transactions-proto \ +// --property value.schema='syntax = "proto3"; message MyRecord { string id = 1; float amount = 2;}' +// { "id":"1000", "amount":500 } +// ``` + +// https://github.com/confluentinc/schema-registry/blob/master/protobuf-provider/src/main/java/io/confluent/kafka/schemaregistry/protobuf/ProtobufSchemaUtils.java + +#[test] +fn test_protobuf_to_json() -> Result<(), ()> { + Ok(()) +} diff --git a/crates/lib/tests/snapshots/r#mod__exported_record.snap b/crates/lib/tests/snapshots/r#mod__exported_record.snap new file mode 100644 index 0000000..50dc752 --- /dev/null +++ b/crates/lib/tests/snapshots/r#mod__exported_record.snap @@ -0,0 +1,71 @@ +--- +source: crates/lib/tests/mod.rs +expression: "ExportedKafkaRecord::from(&record)" +input_file: crates/lib/tests/inputs/record-1.json +--- +ExportedKafkaRecord { + record: KafkaRecord { + value: Json( + Object { + "type": String("Feature"), + "geometry": Object { + "type": String("Point"), + "coordinates": Array [ + Number(4.0215), + Number(49.211613), + ], + }, + "properties": Object { + "label": String("Rue Franz Kafka 51100 Reims"), + "score": Number(0.6992354545454544), + "id": String("51454_3782"), + "name": String("Rue Franz Kafka"), + "postcode": String("51100"), + "citycode": String("51454"), + "x": Number(774438.44), + "y": Number(6901804.03), + "city": String("Reims"), + "context": String("51, Marne, Grand Est"), + "type": String("street"), + "importance": Number(0.69159), + "street": String("Rue Franz Kafka"), + }, + }, + ), + value_as_string: "", + key: Json( + String("381"), + ), + key_as_string: "", + topic: "public-french-addresses", + timestamp: Some( + 1732481153241, + ), + partition: 0, + offset: 4, + headers: {}, + key_schema: Some( + Schema { + id: SchemaId( + 1, + ), + schema_type: None, + }, + ), + value_schema: Some( + Schema { + id: SchemaId( + 2, + ), + schema_type: None, + }, + ), + size: 381, + }, + date_time: Some( + 2024-11-24T21:45:53.241+01:00, + ), + absolute_delta_in_ms: 0, + relative_delta_in_ms: 0, + search_query: "", +} diff --git a/crates/lib/tests/snapshots/r#mod__inputs@1.sql.snap b/crates/lib/tests/snapshots/r#mod__inputs@1.sql.snap new file mode 100644 index 0000000..5f7db54 --- /dev/null +++ b/crates/lib/tests/snapshots/r#mod__inputs@1.sql.snap @@ -0,0 +1,72 @@ +--- +source: crates/lib/tests/mod.rs +expression: parse_search_query(input) +input_file: crates/lib/tests/inputs/1.sql +--- +Ok( + ( + "", + SearchQuery { + expression: OrExpression( + [ + AndTerm( + Atom( + Filter( + Filter { + name: "my-wasm-filter", + parameters: [ + String( + "cool", + ), + String( + "cat", + ), + ], + }, + ), + ), + ), + AndTerm( + Atom( + Parenthesis( + OrTerm( + AndExpression( + [ + Atom( + Compare( + TimestampBetween( + 2024-05-28T17:55:08.145+02:00, + [datetime], + ), + ), + ), + Atom( + Compare( + Value( + None, + Contain, + "foundation", + ), + ), + ), + ], + ), + ), + ), + ), + ), + ], + ), + limit: Some( + 1000, + 
), + from: Some( + Beginning, + ), + order_by: OrderBy { + order: Key, + keyword: Desc, + }, + }, + ), +) diff --git a/crates/lib/tests/snapshots/r#mod__inputs@2.sql.snap b/crates/lib/tests/snapshots/r#mod__inputs@2.sql.snap new file mode 100644 index 0000000..1b8973b --- /dev/null +++ b/crates/lib/tests/snapshots/r#mod__inputs@2.sql.snap @@ -0,0 +1,77 @@ +--- +source: crates/lib/tests/mod.rs +expression: parse_search_query(input) +input_file: crates/lib/tests/inputs/2.sql +--- +Ok( + ( + "", + SearchQuery { + expression: OrExpression( + [ + AndExpression( + [ + Atom( + Compare( + Value( + None, + Contain, + "uv", + ), + ), + ), + Atom( + Compare( + Key( + Contain, + "foobar", + ), + ), + ), + ], + ), + AndExpression( + [ + Atom( + Compare( + Topic( + Equal, + "french-recipes", + ), + ), + ), + Not( + Parenthesis( + OrTerm( + AndTerm( + Atom( + Compare( + Partition( + NotEqual, + 1, + ), + ), + ), + ), + ), + ), + ), + ], + ), + ], + ), + limit: Some( + 100, + ), + from: Some( + OffsetTail( + 5000, + ), + ), + order_by: OrderBy { + order: Timestamp, + keyword: Asc, + }, + }, + ), +) diff --git a/crates/lib/tests/snapshots/r#mod__inputs@3.sql.snap b/crates/lib/tests/snapshots/r#mod__inputs@3.sql.snap new file mode 100644 index 0000000..49a2d1f --- /dev/null +++ b/crates/lib/tests/snapshots/r#mod__inputs@3.sql.snap @@ -0,0 +1,146 @@ +--- +source: crates/lib/tests/mod.rs +expression: parse_search_query(input) +input_file: crates/lib/tests/inputs/3.sql +--- +Ok( + ( + "", + SearchQuery { + expression: OrExpression( + [ + AndExpression( + [ + Atom( + Parenthesis( + OrExpression( + [ + AndTerm( + Atom( + Parenthesis( + OrTerm( + AndExpression( + [ + Atom( + Compare( + Topic( + Equal, + "system", + ), + ), + ), + Atom( + Compare( + Key( + Contain, + "restart", + ), + ), + ), + ], + ), + ), + ), + ), + ), + AndTerm( + Not( + Parenthesis( + OrTerm( + AndExpression( + [ + Atom( + Compare( + Value( + None, + StartWith, + "ignored", + ), + ), + ), + Atom( + Compare( + Partition( + GreaterThan, + 2, + ), + ), + ), + ], + ), + ), + ), + ), + ), + ], + ), + ), + ), + Atom( + Parenthesis( + OrTerm( + AndTerm( + Atom( + Compare( + TimestampBetween( + [datetime], + [datetime], + ), + ), + ), + ), + ), + ), + ), + Atom( + Filter( + Filter { + name: "myFilter", + parameters: [ + String( + "check", + ), + String( + "error", + ), + Number( + 500, + ), + ], + }, + ), + ), + ], + ), + AndTerm( + Atom( + Filter( + Filter { + name: "myFilter", + parameters: [ + String( + "type", + ), + Number( + 100, + ), + ], + }, + ), + ), + ), + ], + ), + limit: Some( + 50, + ), + from: Some( + Beginning, + ), + order_by: OrderBy { + order: Key, + keyword: Desc, + }, + }, + ), +) diff --git a/crates/tui/Cargo.toml b/crates/tui/Cargo.toml new file mode 100644 index 0000000..49084b2 --- /dev/null +++ b/crates/tui/Cargo.toml @@ -0,0 +1,43 @@ +[package] +name = "yozefu-tui" +description = "A TUI for browsing kafka topics" +keywords = ["kafka", "consumer", "replace", "regex"] +readme = "README.md" +categories = [ + "command-line-utilities", + "gui", + "development-tools", + "visualization", +] +version.workspace = true +authors.workspace = true +edition.workspace = true +homepage.workspace = true +license.workspace = true +repository.workspace = true + + +[dependencies] +tokio = { version = "1", features = ["full", "tracing"] } +serde = { version = "1.0.215", features = ["derive"] } +serde_json = { version = "1.0.133", features = ["preserve_order"] } +log = "0.4.22" +tui-input = "0.11.1" +chrono = "0.4.39" +strum = { version = 
"0.26.3", features = ["derive", "strum_macros"] } +ratatui = { version = "0.29.0", features = ["serde", "unstable-rendered-line-info"] } +crossterm = { version = "0.28.1", features = ["event-stream"] } +itertools = "0.13.0" +bytesize = { version = "1.3.0" } +nom = "8.0.0-beta.1" +throbber-widgets-tui = "0.8.0" +futures = "0.3.31" +open = "5.3.1" +tokio-util = "0.7.13" +thousands = "0.2.0" +circular-buffer = "0.1.9" +copypasta = "0.10.1" +rayon = "1.10.0" +lib = { workspace = true } +app = { workspace = true } +rdkafka = { version = "0.37.0", features = ["cmake-build"] } diff --git a/crates/tui/LICENSE b/crates/tui/LICENSE new file mode 100644 index 0000000..21d1827 --- /dev/null +++ b/crates/tui/LICENSE @@ -0,0 +1,13 @@ +Copyright [2024] yozefu-tui + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. +You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. \ No newline at end of file diff --git a/crates/tui/README.md b/crates/tui/README.md new file mode 100644 index 0000000..7353d96 --- /dev/null +++ b/crates/tui/README.md @@ -0,0 +1,14 @@ +# yozefu-tui + +[![Build](https://github.com/MAIF/yozefu/actions/workflows/build.yml/badge.svg)](https://github.com/MAIF/yozefu/actions/workflows/build.yml) +[![](https://img.shields.io/crates/v/yozefu-tui.svg)](https://crates.io/crates/yozefu-tui) + + +This crate contains the glue code for [Ratatui](https://github.com/ratatui/ratatui) and the [`yozefu-app`](https://github.com/MAIF/yozefu) crate. + + +## Usage + +```bash +cargo add yozefu-tui +``` \ No newline at end of file diff --git a/crates/tui/src/action.rs b/crates/tui/src/action.rs new file mode 100644 index 0000000..6e2b079 --- /dev/null +++ b/crates/tui/src/action.rs @@ -0,0 +1,83 @@ +use app::{search::ValidSearchQuery, Config}; +use std::collections::HashSet; + +use lib::{search::OrderBy, KafkaRecord, TopicDetail}; + +use super::component::{ComponentName, Shortcut}; + +/// Actions that can be dispatched to the UI +#[allow(clippy::large_enum_variant)] +#[derive(Debug, Clone, PartialEq)] +pub enum Action { + Tick, + Render, + /// Notify the UI that the terminal has been resized + Resize(u16, u16), + /// Notify the UI that the app is about to quit + Quit, + /// Request the app to export the given record into the file + Export(KafkaRecord), + /// Dispatch statistics about the number of processed records + Count((usize, usize, usize)), + /// Dispatch the new shortcuts to the UI + Shortcuts(Vec, bool), + /// Request the app to clear the current notification + ResetNotification(), + /// Request the UI to show a new notification + Notification(Notification), + /// Request the UI to start searching for kafka records + Search(ValidSearchQuery), + /// notification to the UI + ShowRecord(KafkaRecord), + /// Request the app to setup a new kafka consumer + NewConsumer(), + /// Request the app to start consuming + Consuming, + /// Request the app to refresh the UI + Refresh, + /// Request to refresh the shortcuts in the footer component + RefreshShortcuts, + /// Request to close the kafka consumer + StopConsuming(), + /// Request the app to fetch details (consumer groups, members...) 
of the given topics + RequestTopicDetails(HashSet), + /// Notify the UI the list of topics + Topics(Vec), + /// Notify the UI that a new record has been polled + NewRecord(KafkaRecord), + /// Request the list of kafka records to be sorted in a specific way + OrderBy(OrderBy), + /// List of topics to consume + SelectedTopics(Vec), + /// Dispatch the new configuration to the UI + NewConfig(Config), + /// Copy the given record to the clipboard + CopyToClipboard(KafkaRecord), + /// Notify the UI that a new component has been be displayed + NewView(ComponentName), + /// Notify the UI the visible components and their order in the stack view + ViewStack((ComponentName, Vec)), + /// Request to open the web browser with the URL template (AKHQ, redpanda-console, etc.) pointing to the given record + Open(KafkaRecord), + /// Request the UI to close the specified component + Close(ComponentName), + /// Notify the UI some details (consumer groups, members...) of a given topic + TopicDetails(Vec), + /// Notify the UI that the user typed a new search query + NewSearchPrompt(String), + /// Notify the progress bar an estimate of the kafka records to consume in total according to the search query + RecordsToRead(usize), +} + +/// A notification is a message displayed at the bottom-right corner of the TUI. +#[derive(Debug, Clone, PartialEq)] +pub struct Notification { + pub level: log::Level, + pub message: String, +} + +impl Notification { + pub fn new(level: log::Level, message: String) -> Self { + Self { level, message } + } +} diff --git a/crates/tui/src/component/footer_component.rs b/crates/tui/src/component/footer_component.rs new file mode 100644 index 0000000..dd5ea37 --- /dev/null +++ b/crates/tui/src/component/footer_component.rs @@ -0,0 +1,162 @@ +//! The footer component displays contextual information: the current cluster, shortcuts and the last notifications +use crossterm::event::KeyEvent; + +use ratatui::{ + layout::Rect, + style::{Color, Style, Stylize}, + text::{Line, Span}, + Frame, +}; +use tokio::sync::mpsc::UnboundedSender; + +use crate::{ + action::{Action, Notification}, + error::TuiError, +}; + +use super::{Component, ComponentName, Shortcut, State}; + +#[derive(Default)] +pub struct FooterComponent { + pub shortcuts: Vec, + pub main_component: ComponentName, + pub state: Vec, + pub notification: Option, + pub action_tx: Option>, + pub ticks: u64, + pub show_shortcuts: bool, +} + +impl FooterComponent { + fn generate_shortcuts(&self, _state: &State) -> Vec> { + let mut spans = vec![]; + for shortcut in &self.shortcuts { + spans.push(format!("[{}]", shortcut.key).bold()); + spans.push(format!(":{} ", shortcut.description).into()); + } + + spans + } + + pub fn show_shortcuts(&mut self, visible: bool) -> &Self { + self.show_shortcuts = visible; + self + } +} + +impl Component for FooterComponent { + fn register_action_handler(&mut self, tx: UnboundedSender) -> Result<(), TuiError> { + self.action_tx = Some(tx); + Ok(()) + } + + fn id(&self) -> ComponentName { + ComponentName::Footer + } + + fn handle_key_events(&mut self, _key: KeyEvent) -> Result, TuiError> { + Ok(None) + } + + fn update(&mut self, action: Action) -> Result, TuiError> { + match action { + Action::Shortcuts(s, show) => { + self.shortcuts = s; + if show { + match self.main_component { + ComponentName::TopicsAndRecords => self + .shortcuts + .push(Shortcut::new("CTRL + O", "Hide topics")), + ComponentName::Records => self + .shortcuts + .push(Shortcut::new("CTRL + O", "Show topics")), + _ => (), + }; + } + 
self.shortcuts.push(Shortcut::new("CTRL + H", "Help")); + self.shortcuts.push(Shortcut::new("TAB", "Next panel")); + self.shortcuts.push(Shortcut::new("ESC", "Quit")); + } + Action::ViewStack((main_component, views)) => { + self.main_component = main_component; + self.state = views; + } + Action::Notification(notification) => { + self.ticks = 0; + self.notification = Some(notification) + } + Action::ResetNotification() => self.notification = None, + Action::Tick => { + self.ticks += 1; + if self.ticks > 30 { + self.notification = None; + } + } + _ => (), + } + Ok(None) + } + + fn draw(&mut self, f: &mut Frame<'_>, rect: Rect, state: &State) -> Result<(), TuiError> { + let mut view_stack = self.state.clone(); + view_stack.dedup(); + view_stack.push(state.focused.clone()); + let mut help = vec![]; + help.push( + format!(" {} ", state.cluster) + .black() + .bold() + .bg(state.theme.white), + ); + help.push(" ".into()); + for v in view_stack.iter().enumerate() { + let colors = match v.0 == view_stack.len() - 1 { + true => (state.theme.bg_active, state.theme.fg_active), + false => (state.theme.bg_disabled, state.theme.fg_disabled), + }; + if v.0 > 0 { + help.push("".to_string().bg(colors.0).fg(state.theme.bg)); + } + let prefix = match v.0 { + 0 if self.main_component == ComponentName::TopicsAndRecords => "◧ ", + 0 if self.main_component == ComponentName::Records => "□ ", + _ => "", + }; + + help.push( + format!(" {}{:<8}", prefix, v.1.label()) + .bg(colors.0) + .fg(colors.1) + .bold(), + ); + help.push("".fg(colors.0)); + } + + help.push(Span::from(" ")); + if self.show_shortcuts { + help.extend(self.generate_shortcuts(state)); + } + + let line = Line::from(help); + f.render_widget(line, rect); + + if let Some(n) = &self.notification { + let notification = + Span::styled(n.message.to_string(), Style::default().italic().not_bold()); + let r = Rect::new( + rect.width + .saturating_sub(u16::try_from(n.message.len().checked_sub(3).unwrap_or(3))?), + rect.y, + n.message.len() as u16, + 1, + ); + let notification = match n.level { + log::Level::Error => notification.fg(Color::LightRed).underlined(), + log::Level::Warn => notification.fg(Color::Yellow), + _ => notification, + }; + f.render_widget(notification, r); + } + Ok(()) + } +} diff --git a/crates/tui/src/component/help_component.rs b/crates/tui/src/component/help_component.rs new file mode 100644 index 0000000..4d8a6fe --- /dev/null +++ b/crates/tui/src/component/help_component.rs @@ -0,0 +1,184 @@ +//! 
Component showing the help + +use crossterm::event::{KeyCode, KeyEvent}; +use itertools::Itertools; +use ratatui::{ + layout::{Margin, Rect}, + style::Stylize, + text::{Line, Span}, + widgets::{ + Block, BorderType, Borders, Clear, Padding, Paragraph, Scrollbar, ScrollbarOrientation, + ScrollbarState, Wrap, + }, + Frame, +}; + +use crate::{error::TuiError, Action}; + +use super::{issue_component::IssueComponent, Component, ComponentName, Shortcut, State}; + +const HELP_HEIGHT: u16 = 42; +const TEN_MINUTES_FRAME: usize = 30 * 60 * 10; +const REPOSITORY_URL: &str = concat!( + " https://github.com/MAIF/yozefu/tree/v", + env!("CARGO_PKG_VERSION") +); + +#[derive(Default)] +pub struct HelpComponent { + pub scroll: u16, + pub scroll_length: u16, + pub scrollbar_state: ScrollbarState, + pub rendered: usize, +} + +impl Component for HelpComponent { + fn id(&self) -> ComponentName { + ComponentName::Help + } + + fn shortcuts(&self) -> Vec { + vec![] + } + + fn handle_key_events(&mut self, key: KeyEvent) -> Result, TuiError> { + self.rendered = 0; + match key.code { + KeyCode::Char('k') | KeyCode::Down => { + self.scroll = (self.scroll + 1).min(self.scroll_length); + } + KeyCode::Char('j') | KeyCode::Up => { + self.scroll = self.scroll.saturating_sub(1); + } + KeyCode::Char('[') => { + self.scroll = 0; + } + KeyCode::Char(']') => { + self.scroll = self.scroll_length; + } + _ => (), + } + Ok(None) + } + + fn draw(&mut self, f: &mut Frame<'_>, rect: Rect, state: &State) -> Result<(), TuiError> { + f.render_widget(Clear, rect); + + let block = Block::default() + .borders(Borders::ALL) + .padding(Padding::horizontal(2)) + .border_type(BorderType::Rounded) + .title(" Help "); + + let block = self.make_block_focused_with_state(state, block); + + let text = vec![ + Line::from(""), + Line::from(""), + Line::from(" Key Description").bold(), + Line::from(" / Focus search input"), + Line::from(" ESC Close the window/app"), + Line::from(" TAB Focus next window"), + Line::from(" SHIFT + TAB Focus previous window"), + Line::from(""), + + Line::from(" Variable Type Alias Description").bold(), + Line::from(" topic String t Kafka topic"), + Line::from(" offset Number o Offset of the record"), + Line::from(" key k Key of the record"), + Line::from(" value v Value of the record"), + Line::from(" partition Number p Partition of the record"), + Line::from(" timestamp String ts Timestamp of the record"), + Line::from(" size String si Size of the record"), + Line::from(" headers Map h Headers of the record"), + Line::from(""), + + Line::from(" Operator Type Description").bold(), + Line::from(" == | != | > | >= | < | <= Number | String Wayne's world, party time! 
Excellent!"), + Line::from(" contains | ~= String Test if the variable contains the specified string"), + Line::from(" starts with String Test if the variable starts with the specified string"), + Line::from(""), + + + Line::from(" Clause Syntax Description").bold(), + Line::from(" limit limit Limit the number of kafka records to receive"), + Line::from(" from from Start consuming records from the beginning, the end or a date"), + Line::from(" order by order by Sort kafka records"), + Line::from(""), + + Line::from(" Input Description").bold(), + Line::from(r#" timestamp >= "1 hours ago" All records published within the last hour"#), + Line::from(r#"v contains "rust" and partition == 2 from beginning limit 1000 The first 1_000 kafka records from partition 2 containing 'rust' in the value"#), + Line::from(r#" (key == "ABC") || (key ~= "XYZ") from end - 5000 Among the latest 5_000 records, return the records where the key is "ABC" or the key contains "XYZ""#), + Line::from(r#" value.hello == "world" order by key desc Any kafka JSON record with a JSON property "hello" with the value "world", sorted by key in descending order"#), + Line::from(""), + Line::from(vec![ + Span::from(" Theme").bold(), + Span::from(format!( + " Theme is '{}'. You can switch between [{}] in the config file or with the '--theme' flag", + state.theme.name, + state.themes.iter().filter(|f| *f != &state.theme.name).join(", ") + )) + ]), + Line::from(vec![ + Span::from(" Configuration").bold(), + Span::from(format!(" '{}'", state.configuration_file.display())) + ]), + Line::from(vec![ + Span::from(" Logs").bold(), + Span::from(format!(" '{}'", state.logs_file.display())) + ]), + Line::from(vec![ + Span::from(" Filters").bold(), + Span::from(format!(" '{}'", state.filters_dir.display())) + ]), + Line::from(vec![ + Span::from(" Themes").bold(), + Span::from(format!(" '{}'", state.themes_file.display())) + ]), + Line::from(vec![ + Span::from(" Version").bold(), + Span::from(REPOSITORY_URL) + ]), + Line::from(""), + ]; + + self.scrollbar_state = self.scrollbar_state.content_length(0); + if rect.height < HELP_HEIGHT { + self.scroll_length = HELP_HEIGHT - rect.height + 2; + self.scrollbar_state = self + .scrollbar_state + .content_length(self.scroll_length as usize) + .position(self.scroll as usize); + } else { + self.scrollbar_state = self.scrollbar_state.content_length(0); + self.scroll = 0; + } + + let paragraph = Paragraph::new(text) + .wrap(Wrap { trim: false }) + .scroll((self.scroll, 0)); + f.render_widget(paragraph.block(block), rect); + + let scrollbar = Scrollbar::new(ScrollbarOrientation::VerticalRight) + .begin_symbol(Some("▲")) + .end_symbol(Some("▼")); + + if self.rendered > TEN_MINUTES_FRAME { + let mut issue = IssueComponent::default(); + issue.draw(f, rect, state)?; + } + + f.render_stateful_widget( + scrollbar, + rect.inner(Margin { + vertical: 1, + horizontal: 0, + }), + &mut self.scrollbar_state, + ); + self.rendered += 1; + + Ok(()) + } +} diff --git a/crates/tui/src/component/issue_component.rs b/crates/tui/src/component/issue_component.rs new file mode 100644 index 0000000..445802b --- /dev/null +++ b/crates/tui/src/component/issue_component.rs @@ -0,0 +1,59 @@ +//! 
A notification component + +use ratatui::{ + layout::{Constraint, Flex, Layout, Rect}, + style::{Style, Stylize}, + text::Line, + widgets::{Block, BorderType, Borders, Clear, Padding, Paragraph}, + Frame, +}; + +use crate::error::TuiError; + +use super::{Component, ComponentName, State}; + +#[derive(Default)] +pub struct IssueComponent {} + +impl Component for IssueComponent { + fn id(&self) -> ComponentName { + ComponentName::Help + } + + fn draw(&mut self, f: &mut Frame<'_>, rect: Rect, state: &State) -> Result<(), TuiError> { + let block = Block::default() + .borders(Borders::ALL) + .padding(Padding::symmetric(5, 0)) + .style(Style::default()) + .border_type(BorderType::Rounded); + + let block = self.make_block_focused(state, block); + let pp = Paragraph::new(vec![ + Line::from(""), + Line::from("ʕノ•ᴥ•ʔノ ︵ ┻━┻"), + Line::from("Alt: Your mate flipping the desk").italic(), + Line::from(""), + Line::from("Are you struggling with the tool?"), + Line::from("Leave us an issue so we can improve it:"), + Line::from("https://github.com/MAIF/yozefu/issues").bold(), + Line::from(""), + Line::from("Press any key to close.").italic(), + ]); + + let mut rect = rect; + rect.width -= 2; + rect.y += 1; + + let [area] = Layout::horizontal([Constraint::Length(54)]) + .flex(Flex::End) + .areas(rect); + let [area] = Layout::vertical([Constraint::Length(11)]) + .flex(Flex::Start) + .areas(area); + + f.render_widget(Clear, area); + f.render_widget(pp.block(block), area); + + Ok(()) + } +} diff --git a/crates/tui/src/component/mod.rs b/crates/tui/src/component/mod.rs new file mode 100644 index 0000000..6472a2d --- /dev/null +++ b/crates/tui/src/component/mod.rs @@ -0,0 +1,139 @@ +mod footer_component; +mod help_component; +mod issue_component; +mod progress_bar_component; +mod record_details_component; +mod records_component; +mod root_component; +mod search_component; +mod shortcut; +mod state; +mod topic_details_component; +mod topics_and_records_component; +mod topics_component; +pub mod ui; +mod vertical_scrollable_block; + +use crossterm::event::{KeyEvent, MouseEvent}; +use ratatui::{ + layout::Rect, + style::{Style, Stylize}, + widgets::{Block, BorderType}, + Frame, +}; +pub use shortcut::Shortcut; +use strum::Display; +use tokio::sync::mpsc::UnboundedSender; +pub use ui::Ui; + +use std::sync::{Arc, LazyLock, Mutex}; + +pub use root_component::RootComponent; +pub use state::State; + +use serde::Deserialize; + +use crate::{records_buffer::RecordsBuffer, tui::Event, Action, TuiError}; + +pub type ConcurrentRecordsBuffer = LazyLock>>; +static BUFFER: ConcurrentRecordsBuffer = + LazyLock::new(|| Arc::new(Mutex::new(RecordsBuffer::new()))); + +#[derive(Debug, Clone, PartialEq, Eq, Deserialize)] +pub enum FocusDirection { + Top, + Left, + Right, + Bottom, +} + +#[derive(Debug, Clone, Display, Hash, PartialEq, Eq, Deserialize, PartialOrd, Ord)] +pub enum ComponentName { + Records, + Topics, + Footer, + RecordDetails, + TopicsAndRecords, + RecordsView, + TopicDetails, + Main, + Search, + Dialog, + Help, +} + +impl ComponentName { + pub fn label(&self) -> String { + match &self { + ComponentName::RecordDetails => "Record".to_string(), + ComponentName::TopicDetails => "Topic".to_string(), + _ => self.to_string(), + } + } +} + +impl Default for ComponentName { + fn default() -> Self { + Self::Topics + } +} + +pub trait WithHeight: Component { + fn content_height(&self) -> usize { + 0 + } +} + +pub trait Component { + #[allow(unused_variables)] + fn register_action_handler(&mut self, tx: UnboundedSender) -> 
Result<(), TuiError> { + Ok(()) + } + + fn id(&self) -> ComponentName; + + fn make_block_focused_with_state<'a>(&self, state: &State, block: Block<'a>) -> Block<'a> { + match state.focused == self.id() { + true => self.make_block_focused(state, block), + false => block, + } + } + + fn make_block_focused<'a>(&self, state: &State, block: Block<'a>) -> Block<'a> { + block + .border_style(Style::default().fg(state.theme.focused_border)) + .border_type(BorderType::Thick) + .title_style(Style::default().bold()) + } + + fn init(&mut self) -> Result<(), TuiError> { + Ok(()) + } + + fn handle_events(&mut self, event: Option) -> Result, TuiError> { + let r = match event { + Some(Event::Key(key_event)) => self.handle_key_events(key_event)?, + Some(Event::Mouse(mouse_event)) => self.handle_mouse_events(mouse_event)?, + _ => None, + }; + Ok(r) + } + + fn handle_key_events(&mut self, _key: KeyEvent) -> Result, TuiError> { + Ok(None) + } + + fn handle_mouse_events(&mut self, _mouse: MouseEvent) -> Result, TuiError> { + Ok(None) + } + + fn update(&mut self, _action: Action) -> Result, TuiError> { + Ok(None) + } + + fn draw(&mut self, f: &mut Frame<'_>, rect: Rect, state: &State) -> Result<(), TuiError>; + + fn shortcuts(&self) -> Vec { + vec![] + } +} diff --git a/crates/tui/src/component/progress_bar_component.rs b/crates/tui/src/component/progress_bar_component.rs new file mode 100644 index 0000000..004ad97 --- /dev/null +++ b/crates/tui/src/component/progress_bar_component.rs @@ -0,0 +1,46 @@ +//! Progress bar at the top of the window when you consume kafka records. +use ratatui::{ + buffer::Buffer, + layout::Rect, + style::{Color, Style}, + widgets::Widget, +}; + +#[derive(Clone, Default)] +pub struct ProgressBarComponent { + length: u64, + progress: u64, +} + +impl ProgressBarComponent { + pub fn new(length: u64) -> Self { + Self { + length, + progress: Default::default(), + } + } + + pub fn set_progress(&mut self, inc: usize) { + self.progress = inc as u64; + } + + pub fn set_length(&mut self, length: usize) { + self.length = length as u64; + } +} + +impl Widget for ProgressBarComponent { + fn render(self, area: Rect, buf: &mut Buffer) { + if self.progress == 0 || self.length == 0 { + return; + } + let percent = 100 * self.progress / self.length; + let dd = (area.right() - area.left()) as u64 * percent / 100; + buf.set_string( + area.left(), + area.top(), + (0..dd).map(|_| "▔").collect::(), + Style::default().fg(Color::Green), + ); + } +} diff --git a/crates/tui/src/component/record_details_component.rs b/crates/tui/src/component/record_details_component.rs new file mode 100644 index 0000000..9d244ab --- /dev/null +++ b/crates/tui/src/component/record_details_component.rs @@ -0,0 +1,301 @@ +//! Component showing all the details of a given kafka record. 
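+//!
+//! A rough usage sketch, not the actual wiring (which lives in `RootComponent`); the
+//! channel, frame, rect and state are assumed to come from the surrounding TUI loop:
+//! ```ignore
+//! let (tx, _rx) = tokio::sync::mpsc::unbounded_channel::<Action>();
+//! let mut details = RecordDetailsComponent::new(&state);
+//! details.register_action_handler(tx)?;
+//! // Selecting a record elsewhere in the TUI ends up here as an action:
+//! details.update(Action::ShowRecord(KafkaRecord::default()))?;
+//! details.draw(frame, rect, &state)?;
+//! ```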
+use bytesize::ByteSize; +use crossterm::event::{KeyCode, KeyEvent}; + +use itertools::Itertools; +use lib::KafkaRecord; +use ratatui::{ + layout::{Margin, Rect}, + style::{Style, Stylize}, + text::{Line, Span, Text}, + widgets::{ + Block, Borders, Clear, Padding, Paragraph, Scrollbar, ScrollbarOrientation, ScrollbarState, + Wrap, + }, + Frame, +}; +use tokio::sync::mpsc::UnboundedSender; + +use crate::{error::TuiError, Action}; + +use super::{Component, ComponentName, Shortcut, State}; + +#[derive(Default)] +pub struct RecordDetailsComponent<'a> { + pub record: Option, + pub scroll: usize, + pub lines: usize, + pub text: Paragraph<'a>, + pub rect: Rect, + pub scroll_size: u16, + pub scrollbar_state: ScrollbarState, + pub action_tx: Option>, +} + +impl RecordDetailsComponent<'_> { + pub fn new(_state: &State) -> Self { + Self { + ..Default::default() + } + } + + fn generate_span(key: &str, value: String) -> Line<'_> { + Line::from(vec![ + Span::styled( + format!("{:>12}: ", key.to_string()), + Style::default().bold(), + ), + Span::styled(value.to_string(), Style::default()), + ]) + } + + pub fn set_scroll_bar_state(&mut self, lines: usize) { + self.lines = lines; + if self.lines < self.rect.height as usize { + self.scroll = 0; + self.scroll_size = 0; + } else { + self.scroll = 0; + self.scroll_size = self + .lines + .saturating_sub(self.rect.height.saturating_sub(10) as usize) + as u16; + } + self.scrollbar_state = ScrollbarState::new(self.scroll_size.into()).position(self.scroll); + } + + fn compute_record_rendering(&mut self) { + if self.record.is_none() { + self.record = Some(KafkaRecord::default()); + } + + let record = self.record.as_ref().unwrap(); + let mut to_render = vec![ + Line::default(), + Self::generate_span("Topic", record.topic.clone()), + Self::generate_span( + "Timestamp", + self.record + .as_ref() + .unwrap() + .timestamp + .unwrap_or(0) + .to_string(), + ), + Self::generate_span( + "DateTime", + self.record + .as_ref() + .unwrap() + .timestamp_as_local_date_time() + .map(|e| e.to_rfc3339_opts(chrono::SecondsFormat::Millis, false)) + .unwrap_or("".to_string()), + ), + Self::generate_span("Offset", record.offset.to_string()), + Self::generate_span("Partition", record.partition.to_string()), + Self::generate_span("Size", ByteSize(record.size as u64).to_string()), + Self::generate_span("Headers", "".to_string()), + ]; + + if let Some(s) = &record.key_schema { + match &s.schema_type { + Some(t) => to_render.push(Self::generate_span( + "Key schema", + format!("{} - {}", s.id, t), + )), + None => to_render.push(Self::generate_span("Key schema", s.id.to_string())), + } + } + if let Some(s) = &record.value_schema { + match &s.schema_type { + Some(t) => to_render.push(Self::generate_span( + "Value schema", + format!("{} - {}", s.id, t), + )), + None => to_render.push(Self::generate_span("Value schema", s.id.to_string())), + } + } + + let longest_header_key = self + .record + .as_ref() + .unwrap() + .headers + .keys() + .map(|e| e.len()) + .max() + .unwrap_or(0); + + let mut formatted_headers = vec![]; + for entry in self + .record + .as_ref() + .unwrap() + .headers + .iter() + .sorted_by(|a, b| a.0.cmp(b.0)) + .enumerate() + { + let e = entry.1; + match entry.0 { + 0 => formatted_headers.push(Span::styled( + format!("{: formatted_headers.push(Span::styled( + format!(" {: = + // h.highlight_line(line, &SYNTAX_SET).unwrap(); + // let escaped = as_24_bit_terminal_escaped(&ranges[..], false); + // let output = escaped.into_text().unwrap(); + // payload_lines.extend(output.lines); + 
//} + let text = Text::from(record.value.to_string_pretty()); + to_render.extend(text.lines); + + let p = Paragraph::new(to_render) + .wrap(Wrap { trim: true }) + .scroll((self.scroll as u16, 0)); + self.set_scroll_bar_state(p.line_count(self.rect.width) + 4); + self.text = p; + } +} + +impl Component for RecordDetailsComponent<'_> { + fn register_action_handler(&mut self, tx: UnboundedSender) -> Result<(), TuiError> { + self.action_tx = Some(tx); + Ok(()) + } + + fn id(&self) -> ComponentName { + ComponentName::RecordDetails + } + + fn handle_key_events(&mut self, key: KeyEvent) -> Result, TuiError> { + match key.code { + KeyCode::Char('k') => { + self.scroll = (self.scroll + 1) + .min(self.lines) + .min(self.scroll_size as usize); + self.scrollbar_state = self.scrollbar_state.position(self.scroll); + } + KeyCode::Char('j') if self.scroll > 0 => { + self.scroll -= 1; + self.scrollbar_state = self.scrollbar_state.position(self.scroll); + } + KeyCode::Char('[') => { + self.scroll = 0; + } + KeyCode::Char(']') => { + self.scroll = self.scroll_size.into(); + } + KeyCode::Char('o') => { + if let Some(record) = &self.record { + self.action_tx + .as_ref() + .unwrap() + .send(Action::Open(record.clone()))?; + } + } + KeyCode::Char('c') => { + if let Some(record) = &self.record { + self.action_tx + .as_ref() + .unwrap() + .send(Action::CopyToClipboard(record.clone()))?; + } + } + KeyCode::Char('e') => { + if let Some(record) = &self.record { + self.action_tx + .as_ref() + .unwrap() + .send(Action::Export(record.clone()))?; + } + } + _ => (), + } + self.scrollbar_state = self.scrollbar_state.position(self.scroll); + Ok(None) + } + + fn update(&mut self, action: Action) -> Result, TuiError> { + if let Action::ShowRecord(record) = action { + self.record = Some(record); + self.compute_record_rendering(); + }; + Ok(None) + } + + fn shortcuts(&self) -> Vec { + vec![ + Shortcut::new("J/K", "Scroll"), + Shortcut::new("↑↓", "Previous/next record"), + ] + } + + fn draw(&mut self, f: &mut Frame<'_>, rect: Rect, state: &State) -> Result<(), TuiError> { + if self.rect != rect { + self.rect = rect; + self.compute_record_rendering(); + } + let p = self + .text + .clone() + .wrap(Wrap { trim: false }) + .scroll((self.scroll as u16, 0)); + + let scrollbar = Scrollbar::new(ScrollbarOrientation::VerticalRight) + .begin_symbol(Some("▲")) + .end_symbol(Some("▼")); + + f.render_widget(Clear, rect); + let block = Block::new() + .borders(Borders::ALL) + .padding(Padding::symmetric(4, 0)) + .title(" Record details "); + let block = self.make_block_focused_with_state(state, block); + + f.render_widget(p.block(block), rect); + + f.render_stateful_widget( + scrollbar, + rect.inner(Margin { + vertical: 1, + horizontal: 0, + }), + &mut self.scrollbar_state, + ); + Ok(()) + } +} diff --git a/crates/tui/src/component/records_component.rs b/crates/tui/src/component/records_component.rs new file mode 100644 index 0000000..ff6ebdc --- /dev/null +++ b/crates/tui/src/component/records_component.rs @@ -0,0 +1,458 @@ +//! Component showing in real time incoming kafka records. 
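+//!
+//! Records are read from the shared `ConcurrentRecordsBuffer` declared in
+//! `component/mod.rs`. A rough construction sketch (the real one happens in
+//! `RootComponent::new`, `tx` being the action channel of the TUI loop):
+//! ```ignore
+//! let mut records = RecordsComponent::new(&BUFFER);
+//! records.register_action_handler(tx)?;
+//! // 'f' toggles follow mode, i.e. always selecting the most recent record:
+//! records.handle_key_events(KeyEvent::new(KeyCode::Char('f'), KeyModifiers::NONE))?;
+//! ```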
+use app::search::ValidSearchQuery; +use copypasta::{ClipboardContext, ClipboardProvider}; +use crossterm::event::{KeyCode, KeyEvent, KeyModifiers}; +use itertools::Itertools; +use lib::ExportedKafkaRecord; +use ratatui::{ + layout::{Alignment, Constraint, Rect}, + style::{Style, Stylize}, + text::{Span, Text}, + widgets::{Block, BorderType, Borders, Cell, Row, Table, TableState}, + Frame, +}; +use thousands::Separable; +use throbber_widgets_tui::ThrobberState; +use tokio::sync::mpsc::UnboundedSender; +use tokio::sync::watch::Receiver; + +use crate::{action::Notification, error::TuiError, records_buffer::BufferAction, Action}; + +use super::{Component, ComponentName, ConcurrentRecordsBuffer, Shortcut, State}; + +pub struct RecordsComponent<'a> { + pub records: &'a ConcurrentRecordsBuffer, + pub state: TableState, + status: ThrobberState, + pub search_query: ValidSearchQuery, + pub consuming: bool, + pub count: (usize, usize, usize), + pub follow: bool, + pub action_tx: Option>, + pub buffer_tx: Receiver, + pub selected_topics: usize, + pub key_events_buffer: Vec, +} + +impl<'a> RecordsComponent<'a> { + pub fn new(records: &'a ConcurrentRecordsBuffer) -> Self { + let buffer_tx = records.lock().map(|e| e.channels.clone().1).ok().unwrap(); + + Self { + records, + state: Default::default(), + status: Default::default(), + search_query: Default::default(), + consuming: Default::default(), + count: Default::default(), + follow: Default::default(), + action_tx: Default::default(), + buffer_tx, + selected_topics: Default::default(), + key_events_buffer: Default::default(), + } + } + + fn buffer_is_empty(&self) -> bool { + self.count.2 == 0 + } + + fn buffer_len(&self) -> usize { + self.count.2 + } + + fn next(&mut self) { + if self.buffer_is_empty() { + self.state.select(None); + return; + } + let i = match self.state.selected() { + Some(i) => { + if i >= self.buffer_len() - 1 { + i + } else { + i + 1 + } + } + None => 0, + }; + self.state.select(Some(i)); + } + + fn shorten_topic(topic: &str) -> String { + let t = topic.replace('_', ""); + let parts = t.split('.'); + let e = parts.map(|e| e.chars().next().unwrap_or('_')).join("."); + e + } + + fn show_details(&mut self) -> Result<(), TuiError> { + if self.state.selected().is_some() { + self.action_tx + .as_ref() + .unwrap() + .send(Action::NewView(ComponentName::RecordDetails))?; + self.set_event_dialog()? 
+ } + Ok(()) + } + + fn set_event_dialog(&mut self) -> Result<(), TuiError> { + if let Some(s) = self.state.selected() { + let record = self.records.lock().unwrap().get(s).unwrap().clone(); + self.action_tx + .as_ref() + .unwrap() + .send(Action::ShowRecord(record))?; + } + Ok(()) + } + + fn previous(&mut self) { + if self.buffer_is_empty() { + self.state.select(None); + return; + } + let i = match self.state.selected() { + Some(i) => { + if i == 0 { + 0 + } else { + i - 1 + } + } + None => 0, + }; + self.state.select(Some(i)); + } + + fn first(&mut self) { + match self.buffer_is_empty() { + true => self.state.select(None), + false => self.state.select(Some(0)), + } + } + + fn last(&mut self) { + match self.buffer_is_empty() { + true => self.state.select(None), + false => self.state.select(Some(self.buffer_len() - 1)), + } + } + + pub fn on_new_record(&mut self, count: (usize, usize, usize)) -> Result<(), TuiError> { + self.count = count; + let length = count.2; + let empty_buffer = length == 0; + if self.follow && !empty_buffer { + self.state.select(Some(length - 1)); + } + if self.state.selected().is_none() && !empty_buffer { + self.state.select(Some(0)); + } + if let Some(s) = self.state.selected() { + if s >= length { + let ii = match empty_buffer { + true => 0, + false => length - 1, + }; + self.state.select(Some(ii)); + } + } + Ok(()) + } + + fn buffer_key_event(&mut self, e: KeyEvent) -> Result<(), TuiError> { + self.key_events_buffer.push(e); + if self.key_events_buffer.len() < 2 { + return Ok(()); + } + self.key_events_buffer.dedup(); + if self.key_events_buffer.len() == 1 { + match self.key_events_buffer.first().unwrap().code { + KeyCode::Char('g') => { + self.follow(false)?; + self.first(); + } + KeyCode::Char('G') => { + self.follow(false)?; + self.last(); + } + _ => (), + } + } + self.key_events_buffer.clear(); + Ok(()) + } + + fn follow(&mut self, follow: bool) -> Result<(), TuiError> { + self.follow = follow; + if self.follow { + self.state.select(match self.buffer_len() { + 0 => None, + i => Some(i - 1), + }); + } + self.action_tx + .as_ref() + .unwrap() + .send(Action::RefreshShortcuts)?; + Ok(()) + } + + fn truncate_value(value: &str, rect: &Rect) -> String { + let split_at = (rect.width.checked_sub(70)).unwrap_or(3) as usize; + match value.len() > split_at { + true => value.chars().take(split_at).collect(), + false => value.to_string(), + } + } +} + +impl Component for RecordsComponent<'_> { + fn register_action_handler(&mut self, tx: UnboundedSender) -> Result<(), TuiError> { + self.action_tx = Some(tx.clone()); + Ok(()) + } + + fn id(&self) -> ComponentName { + ComponentName::Records + } + + fn handle_key_events(&mut self, key: KeyEvent) -> Result, TuiError> { + match key.code { + KeyCode::Char('c') => { + if let Some(s) = self.state.selected() { + let r = self.records.lock().unwrap(); + let record = r.get(s).unwrap(); + let mut ctx = ClipboardContext::new().unwrap(); + let exported_record: ExportedKafkaRecord = record.into(); + self.action_tx.as_ref().unwrap().send(Action::Notification( + Notification::new(log::Level::Info, "Copied to clipboard".to_string()), + ))?; + ctx.set_contents(serde_json::to_string_pretty(&exported_record)?) 
+ .unwrap(); + } + } + KeyCode::Char('f') => self.follow(!self.follow)?, + KeyCode::Char('v') | KeyCode::Enter => { + self.show_details()?; + } + KeyCode::Char('e') if key.modifiers.contains(KeyModifiers::CONTROL) => { + let records = self.records.lock().unwrap(); + for record in records.iter() { + self.action_tx + .as_ref() + .unwrap() + .send(Action::Export(record.clone()))?; + } + } + KeyCode::Char('e') => { + if let Some(s) = self.state.selected() { + let r = self.records.lock().unwrap(); + let record = r.get(s).unwrap(); + self.action_tx + .as_ref() + .unwrap() + .send(Action::Export(record.clone()))?; + } + } + KeyCode::Char('o') => { + if let Some(s) = self.state.selected() { + let r = self.records.lock().unwrap(); + let record = r.get(s).unwrap(); + self.action_tx + .as_ref() + .unwrap() + .send(Action::Open(record.clone()))?; + } + } + KeyCode::Char('[') => { + self.follow(false)?; + self.first(); + } + KeyCode::Char(']') => { + self.follow(false)?; + self.last(); + } + KeyCode::Down => { + self.follow(false)?; + self.next(); + self.set_event_dialog()?; + } + KeyCode::Up => { + self.follow(false)?; + self.previous(); + self.set_event_dialog()?; + } + KeyCode::Char('g') | KeyCode::Char('G') => self.buffer_key_event(key)?, + _ => (), + } + Ok(None) + } + + fn update(&mut self, action: Action) -> Result, TuiError> { + let mut a = self.buffer_tx.clone(); + let BufferAction::Count(count) = *a.borrow_and_update(); + let _ = self.on_new_record(count); + match action.clone() { + Action::NewConsumer() => { + self.count = (0, 0, 0); + } + Action::Tick => self.status.calc_next(), + Action::SelectedTopics(topics) => self.selected_topics = topics.len(), + Action::Consuming => self.consuming = true, + Action::StopConsuming() => { + self.consuming = false; + self.count = (0, 0, 0); + } + Action::Search(search_query) => { + self.state.select(None); + self.search_query = search_query; + } + _ => (), + }; + Ok(None) + } + + fn draw(&mut self, f: &mut Frame<'_>, rect: Rect, state: &State) -> Result<(), TuiError> { + let focused = state.is_focused(self.id()); + let block = Block::default() + .borders(Borders::ALL) + .border_type(BorderType::Rounded) + .title(" Records "); + + let block = self.make_block_focused_with_state(state, block); + + let normal_style = Style::default(); + let header_cells = vec![ + Cell::new(Text::from("Timestamp")), + Cell::new(Text::from("Offset").alignment(Alignment::Right)), + Cell::new(Text::from("Partition").alignment(Alignment::Right)), + Cell::new(Text::from("Topic").alignment(Alignment::Right)), + Cell::new(Text::from("Key").alignment(Alignment::Right)), + Cell::new(Text::from("Value")), + ]; + let header = Row::new(header_cells) + .style(normal_style) + .height(1) + .bottom_margin(1); + let r = self.records.lock().unwrap(); + + // TODO render only records in the viewport + let rows = r.iter().enumerate().map(|(index, item)| { + if let Some(s) = self.state.selected() { + let is_visible = (s + rect.height as usize) > index + && s.saturating_sub(rect.height as usize) <= index; + if !is_visible { + return Row::new(Vec::::new()).height(1_u16); + } + } + let cells = vec![ + Cell::new(Text::from( + item.timestamp_as_local_date_time() + .map(|e| e.to_rfc3339_opts(chrono::SecondsFormat::Millis, false)) + .unwrap_or("".to_string()), + )), + Cell::new(Text::from(item.offset.to_string()).alignment(Alignment::Right)), + Cell::new(Text::from(item.partition.to_string()).alignment(Alignment::Right)), + 
Cell::new(Text::from(Self::shorten_topic(&item.topic)).alignment(Alignment::Right)),
+                Cell::new(Text::from(item.key_as_string.to_string()).alignment(Alignment::Right)),
+                Cell::new(Text::from(Self::truncate_value(
+                    &item.value_as_string,
+                    &rect,
+                ))),
+            ];
+            Row::new(cells).height(1_u16)
+        });
+        let table = Table::new(
+            rows,
+            [
+                Constraint::Min(30),
+                Constraint::Min(10),
+                Constraint::Min(10),
+                Constraint::Min(11),
+                Constraint::Min(12),
+                Constraint::Percentage(100),
+            ],
+        )
+        .header(header)
+        .row_highlight_style(match focused {
+            true => Style::default()
+                .bg(state.theme.bg_focused_selected)
+                .fg(state.theme.fg_focused_selected)
+                .bold(),
+            false => Style::default()
+                .bg(state.theme.bg_unfocused_selected)
+                .fg(state.theme.fg_unfocused_selected),
+        });
+
+        let metrics = Span::styled(
+            format!(
+                " {} / {} ",
+                self.count.0.separate_with_underscores(),
+                self.count.1.separate_with_underscores()
+            ),
+            Style::default(),
+        );
+        let inner = block.inner(rect);
+        f.render_widget(block, rect);
+
+        f.render_stateful_widget(table, inner, &mut self.state);
+        let metrics_area = Rect::new(
+            inner
+                .right()
+                .checked_sub(u16::try_from(metrics.width())?)
+                .unwrap_or(1000)
+                .checked_sub(10)
+                .unwrap_or(1000),
+            inner.y,
+            metrics.width() as u16,
+            1,
+        );
+
+        if self.consuming && self.count.1 != 0 {
+            f.render_widget(metrics, metrics_area);
+        }
+        if self.consuming {
+            let simple = throbber_widgets_tui::Throbber::default();
+            let ss = Span::styled(
+                " Live ",
+                Style::default()
+                    .fg(state.theme.white)
+                    .bg(state.theme.orange),
+            )
+            .bold();
+            f.render_widget(
+                ss,
+                Rect::new(inner.right().saturating_sub(9), inner.y, 8, 1),
+            );
+            f.render_stateful_widget(
+                simple,
+                Rect::new(inner.right().saturating_sub(3), inner.y, 2, 1),
+                &mut self.status,
+            );
+        }
+        Ok(())
+    }
+
+    fn shortcuts(&self) -> Vec<Shortcut> {
+        let shortcuts = vec![
+            Shortcut::new("C", "Copy"),
+            Shortcut::new("O", "Open"),
+            // Shortcut::new("[", "First record"),
+            // Shortcut::new("]", "Last record"),
+            Shortcut::new("E", "Export"),
+            Shortcut::new(
+                "f",
+                match self.follow {
+                    true => "Unfollow",
+                    false => "Follow",
+                },
+            ),
+            //Shortcut::new("↑↓", "Scroll"),
+        ];
+
+        shortcuts
+    }
+}
diff --git a/crates/tui/src/component/root_component.rs b/crates/tui/src/component/root_component.rs
new file mode 100644
index 0000000..5c4c5b3
--- /dev/null
+++ b/crates/tui/src/component/root_component.rs
@@ -0,0 +1,401 @@
+//! This component handles the main layout of the TUI
+//! and renders components based on the current context.
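+//!
+//! Focus rotation with TAB / SHIFT + TAB follows the order returned by
+//! `focus_order_of` for the current main view, for instance:
+//! ```ignore
+//! assert_eq!(
+//!     focus_order_of(&ComponentName::TopicsAndRecords),
+//!     vec![
+//!         ComponentName::Topics,
+//!         ComponentName::Records,
+//!         ComponentName::Search
+//!     ]
+//! );
+//! ```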
+use app::Config; +use copypasta::{ClipboardContext, ClipboardProvider}; +use lib::ExportedKafkaRecord; +use std::{ + collections::HashMap, + sync::{Arc, Mutex}, +}; + +use crossterm::event::{KeyCode, KeyEvent, KeyModifiers}; +use ratatui::{ + layout::{Constraint, Direction, Layout, Margin, Rect}, + widgets::Clear, + Frame, +}; +use tokio::sync::{mpsc::UnboundedSender, watch::Receiver}; + +use crate::{error::TuiError, records_buffer::BufferAction, Action, Notification}; + +use super::{ + footer_component::FooterComponent, help_component::HelpComponent, + progress_bar_component::ProgressBarComponent, record_details_component::RecordDetailsComponent, + records_component::RecordsComponent, search_component::SearchComponent, + topic_details_component::ScrollableTopicDetailsComponent, + topics_and_records_component::TopicsAndRecordsComponent, topics_component::TopicsComponent, + Component, ComponentName, ConcurrentRecordsBuffer, State, +}; + +pub struct RootComponent { + components: HashMap>>, + views: Vec, + state: State, + focus_history: Vec, + focus_order: Vec, + progress_bar: ProgressBarComponent, + buffer_rx: Receiver, + action_tx: Option>, +} + +impl RootComponent { + #[allow(clippy::arc_with_non_send_sync)] + pub fn new( + query: String, + selected_topics: Vec, + config: &Config, + records: &'static ConcurrentRecordsBuffer, + state: State, + ) -> Self { + let buffer_rx = records.lock().map(|e| e.channels.clone().1).ok().unwrap(); + let mut footer = FooterComponent::default(); + footer.show_shortcuts(config.show_shortcuts); + + let mut components: [Arc>; 8] = [ + Arc::new(Mutex::new(TopicsComponent::new(selected_topics))), + Arc::new(Mutex::new(RecordsComponent::new(records))), + Arc::new(Mutex::new(ScrollableTopicDetailsComponent::default())), + Arc::new(Mutex::new(RecordDetailsComponent::new(&state))), + Arc::new(Mutex::new(SearchComponent::new( + &query, + config.history.clone(), + ))), + Arc::new(Mutex::new(footer)), + Arc::new(Mutex::new(HelpComponent::default())), + Arc::new(Mutex::new(FooterComponent::default())), + ]; + + components[components.len() - 1] = Arc::new(Mutex::new(TopicsAndRecordsComponent::new( + components[0].clone(), + components[1].clone(), + ))); + + let components: HashMap>> = components + .into_iter() + .map(|c| { + let id = c.lock().unwrap().id(); + (id, c) + }) + .collect(); + + Self { + components, + buffer_rx, + progress_bar: ProgressBarComponent::new(400), + views: vec![ComponentName::TopicsAndRecords], + focus_order: focus_order_of(&ComponentName::TopicsAndRecords), + focus_history: vec![], + state, + action_tx: Default::default(), + } + } + + fn focus_next(&mut self, current: &ComponentName) -> ComponentName { + let index = self + .focus_order + .iter() + .position(|e| e == current) + .unwrap_or(self.focus_order.len() - 1); + match index == self.focus_order.len() - 1 { + true => self.focus_order.first().unwrap().clone(), + false => self.focus_order[index + 1].clone(), + } + } + + fn focus(&mut self, current: ComponentName) -> Result<(), TuiError> { + self.state.focused = current; + let mut shortcuts = self + .components + .get(&self.state.focused) + .unwrap() + .lock() + .unwrap() + .shortcuts(); + + if self.state.focused == ComponentName::RecordDetails { + shortcuts.extend( + self.components + .get(&ComponentName::Records) + .unwrap() + .lock() + .unwrap() + .shortcuts(), + ); + } + + self.action_tx + .as_ref() + .unwrap() + .send(Action::Shortcuts(shortcuts, self.views.len() == 1))?; + Ok(()) + } + + fn close(&mut self) { + self.views.pop(); + if 
self.views.is_empty() { + self.action_tx.as_ref().unwrap().send(Action::Quit).unwrap(); + } else { + self.focus_order = focus_order_of(self.views.last().unwrap()); + + let last_focused_component = self + .focus_history + .pop() + .unwrap_or(self.focus_order.first().unwrap().clone()); + + let focus = match self.focus_order.contains(&self.state.focused) { + true => self.state.focused.clone(), + false => last_focused_component, + }; + self.focus(focus).unwrap(); + } + self.notify_footer().unwrap(); + } + + fn toggle_view(&mut self, view: ComponentName) -> Result<(), TuiError> { + if self.views.last().unwrap() == &view { + self.close(); + } else { + self.action_tx + .as_ref() + .unwrap() + .send(Action::NewView(view))?; + } + Ok(()) + } + + fn focus_previous(&mut self, current: &ComponentName) -> ComponentName { + let index = self + .focus_order + .iter() + .position(|e| e == current) + .unwrap_or(1); + match index == 0 { + true => self.focus_order.last().unwrap().clone(), + false => self.focus_order[index - 1].clone(), + } + } + + fn notify_footer(&self) -> Result<(), TuiError> { + if !self.views.is_empty() { + self.action_tx.as_ref().unwrap().send(Action::ViewStack(( + self.views.first().unwrap().clone(), + self.focus_history.clone(), + )))?; + } + Ok(()) + } +} + +impl Component for RootComponent { + fn register_action_handler(&mut self, tx: UnboundedSender) -> Result<(), TuiError> { + self.action_tx = Some(tx.clone()); + for component in self.components.values_mut() { + component + .lock() + .unwrap() + .register_action_handler(tx.clone())?; + } + self.notify_footer()?; + self.action_tx.as_ref().unwrap().send(Action::Shortcuts( + self.components + .get(&self.state.focused) + .unwrap() + .lock() + .unwrap() + .shortcuts(), + true, + ))?; + Ok(()) + } + + fn id(&self) -> ComponentName { + ComponentName::Main + } + + fn handle_key_events(&mut self, key: KeyEvent) -> Result, TuiError> { + match key.code { + KeyCode::Tab => { + let new_focus = self.focus_next(&self.state.focused.clone()); + self.focus(new_focus)?; + } + KeyCode::BackTab => { + let new_focus = self.focus_previous(&self.state.focused.clone()); + self.focus(new_focus)?; + } + KeyCode::Char('r') if key.modifiers.contains(KeyModifiers::CONTROL) => { + self.action_tx.as_ref().unwrap().send(Action::Refresh)?; + } + KeyCode::Char('f') if key.modifiers.contains(KeyModifiers::CONTROL) => { + self.state.focused = ComponentName::Search; + return Ok(None); + } + KeyCode::Char('/') | KeyCode::Char(':') + if self.state.focused != ComponentName::Search => + { + self.state.focused = ComponentName::Search; + return Ok(None); + } + KeyCode::Char('c') if key.modifiers.contains(KeyModifiers::CONTROL) => { + self.action_tx.as_ref().unwrap().send(Action::Quit)?; + } + KeyCode::Char('o') + if key.modifiers.contains(KeyModifiers::CONTROL) && self.views.len() == 1 => + { + match self.views.first().unwrap() { + ComponentName::Records => self.views[0] = ComponentName::TopicsAndRecords, + ComponentName::TopicsAndRecords => self.views[0] = ComponentName::Records, + _ => unreachable!("nope nope"), + } + self.notify_footer()?; + if self.views.len() == 1 { + self.focus_order = focus_order_of(&self.views[0]); + self.state.focused = match self.focus_order.contains(&self.state.focused) { + true => self.state.focused.clone(), + false => self.focus_order.first().unwrap().clone(), + }; + self.action_tx + .as_ref() + .unwrap() + .send(Action::RefreshShortcuts)?; + } + return Ok(None); + } + KeyCode::Char('h') if key.modifiers.contains(KeyModifiers::CONTROL) => { + 
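+                // CTRL + H toggles the help view: it is closed if it is already the
+                // top-most view, otherwise a `NewView(Help)` action is emitted.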
self.toggle_view(ComponentName::Help)?; + return Ok(None); + } + KeyCode::Esc => self.close(), + _ => (), + }; + let focused_component = self.components.get(&self.state.focused).unwrap(); + focused_component.lock().unwrap().handle_key_events(key)?; + if self.state.focused == ComponentName::RecordDetails { + self.components + .get(&ComponentName::Records) + .unwrap() + .lock() + .unwrap() + .handle_key_events(key)?; + } + Ok(None) + } + + fn update(&mut self, action: Action) -> Result, TuiError> { + match action { + Action::RefreshShortcuts => { + let focused = self.state.focused.clone(); + self.focus(focused)?; + } + Action::NewView(ref action) => { + if action == &ComponentName::RecordDetails + && self.state.focused == ComponentName::RecordDetails + { + return Ok(None); + } + self.focus_history.push(self.state.focused.clone()); + self.focus_history.dedup(); + let last_focused = self.focus_history.last().unwrap().clone(); + self.focus_order = focus_order_of(action); + self.views = self + .views + .iter() + .filter(|a| a != &action) + .cloned() + .collect(); + + self.views.push(action.clone()); + match self.focus_order.contains(&last_focused) { + true => self.focus(last_focused), + false => self.focus(self.focus_order.first().unwrap().clone()), + }?; + self.notify_footer()?; + } + Action::RecordsToRead(length) => { + self.progress_bar.set_length(length); + } + Action::Close(_) => self.close(), + Action::CopyToClipboard(ref record) => { + let mut ctx = ClipboardContext::new().unwrap(); + let exported_record: ExportedKafkaRecord = record.into(); + self.action_tx + .as_ref() + .unwrap() + .send(Action::Notification(Notification::new( + log::Level::Info, + "Copied to clipboard".to_string(), + )))?; + ctx.set_contents(serde_json::to_string_pretty(&exported_record)?) 
+ .unwrap(); + } + _ => (), + } + for component in self.components.values_mut() { + component.lock().unwrap().update(action.clone())?; + } + Ok(None) + } + + fn draw(&mut self, f: &mut Frame<'_>, rect: Rect, _: &State) -> Result<(), TuiError> { + let mut a = self.buffer_rx.clone(); + let BufferAction::Count(count) = *a.borrow_and_update(); + self.progress_bar.set_progress(count.1); + + f.render_widget(Clear, rect); + + let main_component = self.components.get(self.views.last().unwrap()).unwrap(); + + let chunks: std::rc::Rc<[Rect]> = Layout::default() + .direction(Direction::Vertical) + .constraints([ + Constraint::Percentage(100), + Constraint::Min(3), + Constraint::Min(3), + ]) + .split(Rect::new( + rect.x + 2, + rect.y + 1, + rect.width.saturating_sub(6), + rect.height.saturating_sub(0), + )); + + //if self.state.focused == ComponentName::Search { + // search_block = self.make_block_focused(&self.state, search_block); + //}; + main_component + .lock() + .unwrap() + .draw(f, chunks[0], &self.state)?; + self.components + .get_mut(&ComponentName::Search) + .unwrap() + .lock() + .unwrap() + .draw(f, chunks[1], &self.state)?; + self.components + .get_mut(&ComponentName::Footer) + .unwrap() + .lock() + .unwrap() + .draw(f, chunks[2].inner(Margin::new(1, 0)), &self.state)?; + + f.render_widget(self.progress_bar.clone(), rect); + //f.render_widget(search_block, chunks[1]); + + Ok(()) + } +} + +pub fn focus_order_of(component: &ComponentName) -> Vec { + match component { + ComponentName::RecordDetails => vec![ComponentName::RecordDetails, ComponentName::Search], + ComponentName::Records => vec![ComponentName::Records, ComponentName::Search], + ComponentName::TopicsAndRecords => vec![ + ComponentName::Topics, + ComponentName::Records, + ComponentName::Search, + ], + ComponentName::TopicDetails => vec![ComponentName::TopicDetails, ComponentName::Search], + ComponentName::Help => vec![ComponentName::Help, ComponentName::Search], + _ => vec![], + } +} diff --git a/crates/tui/src/component/search_component.rs b/crates/tui/src/component/search_component.rs new file mode 100644 index 0000000..a13402b --- /dev/null +++ b/crates/tui/src/component/search_component.rs @@ -0,0 +1,297 @@ +//! This component renders the search bar. +//! It comes with the following features: +//! - all queries are stored into a history. +//! - The component suggests queries based on your history. 
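+//!
+//! Validation is debounced: roughly 700 ms after the last keystroke, prompts longer
+//! than five characters are parsed and any parse error is surfaced as a notification.
+//! A rough sketch of the parsing step in isolation:
+//! ```ignore
+//! match ValidSearchQuery::from_str(r#"value contains "rust" limit 100"#) {
+//!     Ok(query) => action_tx.send(Action::Search(query))?,
+//!     Err(e) => action_tx.send(Action::Notification(Notification::new(
+//!         log::Level::Error,
+//!         e.to_string(),
+//!     )))?,
+//! }
+//! ```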
+ +use std::{str::FromStr, time::Duration}; + +use app::search::ValidSearchQuery; +use crossterm::event::{Event, KeyCode, KeyEvent, KeyModifiers}; +use itertools::Itertools; +use lib::{error::SearchError, Error}; +use ratatui::{ + layout::{Position, Rect}, + style::{Style, Stylize}, + text::{Line, Span}, + widgets::{Block, BorderType, Borders, Clear, Padding, Paragraph, Wrap}, + Frame, +}; +use tokio::{select, sync::mpsc::UnboundedSender, time::Instant}; +use tokio_util::sync::CancellationToken; +use tui_input::{backend::crossterm::EventHandler, Input}; + +use crate::{ + error::TuiError, + {Action, Notification}, +}; + +use super::{Component, ComponentName, Shortcut, State}; + +#[derive(Default)] +pub struct SearchComponent { + pub input: Input, + pub index_history: usize, + pub history: Vec, + pub compiler_worker: CancellationToken, + pub remaining_input: Option, + pub action_tx: Option>, + pub autocomplete: Option, + // A hack to detect copy-paste events and replace \n with a space + pub entered: Option, +} + +impl SearchComponent { + pub fn new(input: &str, history: Vec) -> Self { + Self { + input: Input::from(input), + index_history: history.len() - 1, + history, + ..Self::default() + } + } + + fn parse_input(&mut self) { + let input = self.input.value().to_string(); + let tt = self.action_tx.clone(); + + self.compiler_worker.cancel(); + self.compiler_worker = CancellationToken::new(); + let token = self.compiler_worker.clone(); + tokio::spawn(async move { + select! { + _ = token.cancelled() => { }, + _ = tokio::time::sleep(Duration::from_millis(700)) => { + if input.len() > 5 { + tt.as_ref().unwrap().send(Action::ResetNotification()).unwrap(); + if let Err(e) = ValidSearchQuery::from_str(&input) { + tt.as_ref().unwrap().send(Action::Notification(Notification::new(log::Level::Error, e.to_string()))).unwrap(); + } + } + } + } + }); + } + + fn autocomplete(&mut self, keycode: KeyCode) { + let prompt = self.input.value(); + + let mut possibilities = self + .history + .iter() + .filter(|e| e.starts_with(prompt)) + .collect_vec(); + possibilities.dedup(); + + self.index_history = match keycode { + KeyCode::Up => match self.index_history == 0 { + true => 0, + false => self.index_history - 1, + }, + KeyCode::Down => match self.index_history == possibilities.len().saturating_sub(1) { + true => self.index_history, + false => self.index_history + 1, + }, + _ => match possibilities.is_empty() { + true => 0, + false => possibilities.len() - 1, + }, + }; + self.autocomplete = possibilities + .get(self.index_history) + .map(|e| e.split_at(prompt.len()).1.to_string()); + } + + fn update_history(&mut self, prompt: &str) -> Result<(), TuiError> { + if !self.history.contains(&prompt.to_string()) { + self.history.push(prompt.into()); + self.index_history = self.history.len() - 1; + self.action_tx + .as_ref() + .unwrap() + .send(Action::NewSearchPrompt(prompt.to_string()))?; + } + Ok(()) + } + + fn search(&mut self) -> Result<(), TuiError> { + let o = self.input.value().to_string(); + match ValidSearchQuery::from_str(o.as_str()) { + Ok(search_query) => { + self.update_history(&o)?; + self.action_tx + .as_ref() + .unwrap() + .send(Action::Notification(Notification::new( + log::Level::Info, + match search_query.is_empty() { + true => "Waiting for new events".to_string(), + false => "Searching".to_string(), + }, + )))?; + + self.action_tx + .clone() + .unwrap() + .send(Action::Search(search_query))?; + } + + Err(e) => { + if let Error::Search(SearchError::Parse(ee)) = &e { + self.remaining_input = 
Some(ee.to_string()); + } + + self.action_tx + .as_ref() + .unwrap() + .send(Action::Notification(Notification::new( + log::Level::Error, + e.to_string(), + )))?; + } + }; + Ok(()) + } + + #[allow(dead_code)] + fn pretty_error_message(error: &nom::Err>) -> String { + match error { + nom::Err::Incomplete(_) => "Cannot parse query".to_string(), + nom::Err::Error(s) => format!("unexpected token '{}'", s.input), + nom::Err::Failure(s) => format!("unexpected token '{}'", s.input), + } + } +} + +impl Component for SearchComponent { + fn register_action_handler(&mut self, tx: UnboundedSender) -> Result<(), TuiError> { + self.action_tx = Some(tx); + self.search()?; + Ok(()) + } + + fn id(&self) -> ComponentName { + ComponentName::Search + } + + fn handle_key_events(&mut self, key: KeyEvent) -> Result, TuiError> { + self.remaining_input = None; + match key.code { + KeyCode::Right => { + if self.input.value().len() == self.input.cursor() { + if let Some(a) = &self.autocomplete { + let input = self.input.value().to_string(); + let autocompleted = format!("{}{}", input, a); + self.input = self + .input + .clone() + .with_cursor(autocompleted.len()) + .with_value(autocompleted); + self.autocomplete = None; + self.compiler_worker.cancel(); + } + } + self.input.handle_event(&Event::Key(key)); + } + KeyCode::Up => self.autocomplete(KeyCode::Up), + KeyCode::Down => self.autocomplete(KeyCode::Down), + KeyCode::Enter => { + self.entered = Some(Instant::now()); + self.search()?; + self.autocomplete = None; + } + _ if !key.modifiers.contains(KeyModifiers::CONTROL) => { + if let Some(e) = self.entered { + if e.elapsed() < Duration::from_millis(10) { + self.input.handle_event(&Event::Key(KeyEvent::new( + KeyCode::Char(' '), + KeyModifiers::NONE, + ))); + } + self.entered = None; + } + self.input.handle_event(&Event::Key(key)); + self.parse_input(); + self.autocomplete(KeyCode::Backspace); + self.action_tx + .as_ref() + .unwrap() + .send(Action::ResetNotification())?; + } + _ => { + self.input.handle_event(&Event::Key(key)); + } + }; + Ok(None) + } + + fn draw(&mut self, f: &mut Frame<'_>, rect: Rect, state: &State) -> Result<(), TuiError> { + let padding = 1; + let input: &str = self.input.value(); + let mut line: Line = match &self.remaining_input { + Some(e) => { + let parts = input.split_at(input.len() - e.len()); + Line::from(vec![ + Span::raw(parts.0), + Span::styled( + parts.1, + Style::default() + .fg(state.theme.orange) + .not_bold() + .underlined(), + ), + ]) + } + None => Line::from(vec![input.into()]), + }; + + if let Some(a) = &self.autocomplete { + line.push_span(Span::styled( + a, + Style::default().fg(state.theme.autocomplete).not_bold(), + )); + } + + let position_cursor = Position { + x: rect.x + 2 + (self.input.visual_cursor()) as u16, + y: rect.y + 1, + }; + + // let rect = rect; + //if rect.width < input.len() as u16 { + // let lines = input.len().div_ceil((rect.width - 15) as usize) as u16; + // info!("Lines {:?}", lines); + // rect.height = 3 + lines; + // rect.y = rect.y - lines; + // position_cursor = Position { + // x: rect.x + 1 + (self.input.visual_cursor()) as u16, + // y: rect.y + 1, + // } + //} + + let block = Block::default() + .borders(Borders::ALL) + .border_type(BorderType::Rounded) + .padding(Padding::left(1)) + .title(" Search "); + let block = self.make_block_focused_with_state(state, block); + + let selected_style = Paragraph::new(line) + .block(Block::new().padding(Padding::left(padding))) + .wrap(Wrap { trim: false }); + + let paragraph = selected_style.block(block); + if 
state.is_focused(self.id()) { + f.set_cursor_position(position_cursor); + } + f.render_widget(Clear, rect); + f.render_widget(paragraph, rect); + Ok(()) + } + + fn shortcuts(&self) -> Vec { + vec![ + Shortcut::new("↑↓", "History"), + Shortcut::new("ENTER", "Search"), + ] + } +} diff --git a/crates/tui/src/component/shortcut.rs b/crates/tui/src/component/shortcut.rs new file mode 100644 index 0000000..b740372 --- /dev/null +++ b/crates/tui/src/component/shortcut.rs @@ -0,0 +1,19 @@ +use serde::Deserialize; + +/// Shortcuts are keyboards shortcuts available to the user to interact with the UI. For instance: +/// ```ignore +/// let shortcut = Shortcut::new("CTRL + P", "Show details"); +/// ``` +/// will be visible as `[CTRL + P]: Show details` in the footer. + +#[derive(Debug, Clone, PartialEq, Eq, Deserialize)] +pub struct Shortcut { + pub key: &'static str, + pub description: &'static str, +} + +impl Shortcut { + pub fn new(key: &'static str, description: &'static str) -> Self { + Self { key, description } + } +} diff --git a/crates/tui/src/component/state.rs b/crates/tui/src/component/state.rs new file mode 100644 index 0000000..b67c3fe --- /dev/null +++ b/crates/tui/src/component/state.rs @@ -0,0 +1,45 @@ +//! The state is a struct containing various information. +//! It is passed to all components. +use app::Config; +use std::path::PathBuf; + +use crate::theme::Theme; + +use super::ComponentName; + +#[derive(Clone)] +pub struct State { + pub focused: ComponentName, + pub cluster: String, + pub themes: Vec, + pub theme: Theme, + pub configuration_file: PathBuf, + pub logs_file: PathBuf, + pub themes_file: PathBuf, + pub filters_dir: PathBuf, +} + +impl Default for State { + fn default() -> Self { + Self::new("localhost", Theme::light(), &Config::default()) + } +} + +impl State { + pub fn new(cluster: &str, theme: Theme, config: &Config) -> Self { + Self { + focused: ComponentName::default(), + cluster: cluster.to_string(), + theme, + themes: config.themes(), + themes_file: config.themes_file(), + configuration_file: config.path.clone(), + logs_file: config.logs_file(), + filters_dir: config.filters_dir(), + } + } + + pub fn is_focused(&self, component_name: ComponentName) -> bool { + self.focused == component_name + } +} diff --git a/crates/tui/src/component/topic_details_component.rs b/crates/tui/src/component/topic_details_component.rs new file mode 100644 index 0000000..2b88931 --- /dev/null +++ b/crates/tui/src/component/topic_details_component.rs @@ -0,0 +1,239 @@ +//! Component showing information regarding a given topic: partitions, consumer groups, replicas ... 
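+//!
+//! The data arrives asynchronously: a throbber is rendered until the component
+//! receives an `Action::TopicDetails(..)` update. A rough sketch (the `TopicDetail`
+//! values are assumed to be fetched by the kafka side of the application):
+//! ```ignore
+//! let mut component = ScrollableTopicDetailsComponent::default();
+//! component.update(Action::TopicDetails(details))?;
+//! component.draw(frame, rect, &state)?;
+//! ```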
+use crossterm::event::KeyEvent; + +use itertools::Itertools; +use lib::{ConsumerGroupState, TopicDetail}; +use ratatui::{ + layout::{Alignment, Constraint, Margin, Offset, Rect}, + style::{Modifier, Style, Stylize}, + text::{Line, Span, Text}, + widgets::{ + Block, BorderType, Borders, Cell, Clear, Padding, Paragraph, Row, Table, TableState, + }, + Frame, +}; +use tokio::sync::mpsc::UnboundedSender; + +use crate::{error::TuiError, Action}; + +use super::{ + vertical_scrollable_block::VerticalScrollableBlock, Component, ComponentName, State, WithHeight, +}; + +pub type ScrollableTopicDetailsComponent = VerticalScrollableBlock; + +#[derive(Default)] +pub struct TopicDetailsComponent { + pub details: Vec, + pub action_tx: Option>, + pub state: TableState, + throbber_state: throbber_widgets_tui::ThrobberState, +} + +impl WithHeight for TopicDetailsComponent { + fn content_height(&self) -> usize { + self.details + .iter() + .map(|e| e.consumer_groups.len()) + .sum::() + + 5 + } +} + +impl Component for TopicDetailsComponent { + fn register_action_handler(&mut self, tx: UnboundedSender) -> Result<(), TuiError> { + self.action_tx = Some(tx); + Ok(()) + } + + fn id(&self) -> ComponentName { + ComponentName::TopicDetails + } + + fn handle_key_events(&mut self, _key: KeyEvent) -> Result, TuiError> { + Ok(None) + } + + fn update(&mut self, action: Action) -> Result, TuiError> { + match action { + Action::Tick => self.throbber_state.calc_next(), + Action::TopicDetails(details) => { + self.details = details; + } + _ => (), + }; + Ok(None) + } + + fn draw(&mut self, f: &mut Frame<'_>, rect: Rect, state: &State) -> Result<(), TuiError> { + let block = Block::new() + .borders(Borders::ALL) + .border_style(Style::default()) + .title(" Topic details ") + .padding(Padding::proportional(3)) + .border_type(BorderType::Rounded); + let block = self.make_block_focused_with_state(state, block); + + if self.details.is_empty() { + f.render_widget(Clear, rect); + let full = throbber_widgets_tui::Throbber::default() + .label("This feature is not ready yet. 
Fetching data, please wait...") + .style(Style::default()) + .throbber_style(Style::default().add_modifier(Modifier::BOLD)) + .throbber_set(throbber_widgets_tui::BRAILLE_DOUBLE) + .use_type(throbber_widgets_tui::WhichUse::Spin); + f.render_widget(block, rect); + f.render_stateful_widget( + full, + rect.inner(Margin::new(6, 4)), + &mut self.throbber_state, + ); + return Ok(()); + } + + if !self.details.is_empty() { + f.render_widget(Clear, rect); + let header_cells = vec![ + Cell::new(Text::from("")), + Cell::new(Text::from("Name")), + Cell::new(Text::from("State")), + Cell::new(Text::from("Partitions").alignment(Alignment::Right)), + Cell::new(Text::from("Members").alignment(Alignment::Right)), + Cell::new(Text::from("Lag").alignment(Alignment::Right)), + ]; + + let header = Row::new(header_cells).bold().height(1); + let mut rows = vec![]; + + for detail in &self.details { + let consumers_groups = detail.consumer_groups.clone(); + rows.extend( + consumers_groups + .into_iter() + .sorted_by(|a, b| a.name.cmp(&b.name)) + .enumerate() + .map(|item| { + Row::new(vec![ + Cell::new( + match item.1.state { + ConsumerGroupState::Unknown => { + Span::styled("⊘", Style::default().fg(state.theme.red)) + } + ConsumerGroupState::Empty => { + Span::styled("◯", Style::default().fg(state.theme.red)) + } + ConsumerGroupState::Dead => { + Span::styled("⊗", Style::default().fg(state.theme.red)) + } + ConsumerGroupState::Stable => Span::styled( + "⏺︎", + Style::default().fg(state.theme.green), + ), + ConsumerGroupState::PreparingRebalance => Span::styled( + "⦿", + Style::default().fg(state.theme.yellow), + ), + ConsumerGroupState::CompletingRebalance => Span::styled( + "⦿", + Style::default().fg(state.theme.yellow), + ), + ConsumerGroupState::Rebalancing => Span::styled( + "⦿", + Style::default().fg(state.theme.yellow), + ), + ConsumerGroupState::UnknownRebalance => Span::styled( + "⊘", + Style::default().fg(state.theme.black), + ), + } + .into_right_aligned_line(), + ), + Cell::new(Span::styled(item.1.name.clone(), Style::default())), + Cell::new(Span::styled(item.1.state.to_string(), Style::default())), + Cell::new( + Span::styled( + item.1.members.len().to_string(), + Style::default(), + ) + .into_right_aligned_line(), + ), + Cell::new( + Span::styled("1", Style::default()).into_right_aligned_line(), + ), + Cell::new( + Span::styled("?", Style::default()).into_right_aligned_line(), + ), + ]) + .height(1_u16) + }), + ); + } + + let table = Table::new( + rows, + [ + Constraint::Length(1), + Constraint::Length(42), + Constraint::Length(24), + Constraint::Length(10), + Constraint::Length(32), + Constraint::Length(6), + ], + ) + .column_spacing(2) + .header(header.clone()); + + let table_area = block.inner(rect); + + let detail = self.details.first().unwrap(); + + let text = vec![ + Line::from(detail.name.clone()).style(Style::default().bold()), + Line::from(format!( + "{} partitions, {} replicas", + detail.partitions, detail.replicas + )) + .style(Style::default()), + Line::from(format!("{} consumer groups", detail.consumer_groups.len())) + .style(Style::default()), + Line::from(""), + Line::from(""), + ]; + + //let main_paragraph = Paragraph::new(text.clone()) + // .style(Style::default()); + + f.render_stateful_widget( + table, + table_area.offset(Offset { x: 0, y: 5 }), + &mut self.state, + ); + f.render_widget( + Paragraph::new(text) + .style(Style::default()) + .block(block.clone()), + rect, + ); + + // + // let mut text: Vec> = vec![]; + // for d in &self.details { + // text.push(Line::from(format!( + // 
"{} - {} {}", + // d.0, + // d.1, + // match d.1 > 1 { + // true => "partitions", + // false => "partition", + // } + // ))); + // for (k, v) in &d.2 { + // text.push(Line::from(format!("{}: lag of {}", k, v))); + // } + // } + // + } + + Ok(()) + } +} diff --git a/crates/tui/src/component/topics_and_records_component.rs b/crates/tui/src/component/topics_and_records_component.rs new file mode 100644 index 0000000..c999bcb --- /dev/null +++ b/crates/tui/src/component/topics_and_records_component.rs @@ -0,0 +1,42 @@ +//! This component is a layout component that renders `[TopicsComponent]` and `[RecordsComponent]`. +use ratatui::{ + layout::{Constraint, Direction, Layout, Rect}, + widgets::Clear, + Frame, +}; +use std::sync::{Arc, Mutex}; + +use crate::error::TuiError; + +use super::{Component, ComponentName, State}; + +pub struct TopicsAndRecordsComponent { + records: Arc>, + topics: Arc>, +} + +impl TopicsAndRecordsComponent { + pub fn new(topics: Arc>, records: Arc>) -> Self { + Self { records, topics } + } +} + +impl Component for TopicsAndRecordsComponent { + fn id(&self) -> ComponentName { + ComponentName::TopicsAndRecords + } + + fn draw(&mut self, f: &mut Frame<'_>, rect: Rect, state: &State) -> Result<(), TuiError> { + f.render_widget(Clear, rect); + + let chunks: std::rc::Rc<[Rect]> = Layout::default() + .direction(Direction::Horizontal) + .constraints([Constraint::Min(62), Constraint::Percentage(100)]) + .spacing(1) + .split(rect); + + self.topics.lock().unwrap().draw(f, chunks[0], state)?; + self.records.lock().unwrap().draw(f, chunks[1], state)?; + Ok(()) + } +} diff --git a/crates/tui/src/component/topics_component.rs b/crates/tui/src/component/topics_component.rs new file mode 100644 index 0000000..7d7fa1b --- /dev/null +++ b/crates/tui/src/component/topics_component.rs @@ -0,0 +1,292 @@ +//! 
Component listing all the kafa topics that can be consumed +use std::collections::HashSet; + +use crossterm::event::{Event, KeyCode, KeyEvent, KeyModifiers}; +use itertools::Itertools; +use ratatui::{ + layout::{Constraint, Direction, Layout, Position, Rect}, + style::{Style, Stylize}, + widgets::{Block, BorderType, Borders, Clear, List, ListItem, ListState, Padding, Paragraph}, + Frame, +}; +use tokio::sync::mpsc::UnboundedSender; +use tui_input::{backend::crossterm::EventHandler, Input}; + +use crate::{error::TuiError, Action}; + +use super::{Component, ComponentName, Shortcut, State}; + +#[derive(Default)] +pub struct TopicsComponent { + pub topics: Vec, + pub visible_topics: Vec, + pub selected: HashSet, + pub state: ListState, + pub action_tx: Option>, + pub input: Input, + pub loading: bool, +} + +impl TopicsComponent { + pub fn new(selected_topics: Vec) -> TopicsComponent { + let loading = selected_topics.is_empty(); + let topics = selected_topics.clone(); + let index = if loading { None } else { Some(1) }; + Self { + selected: HashSet::from_iter(selected_topics), + visible_topics: topics.clone(), + topics, + state: ListState::default().with_selected(index), + loading, + ..Default::default() + } + } + + fn next(&mut self) { + if self.visible_topics.is_empty() { + return; + } + + match self.state.selected() { + Some(i) => { + if i >= self.visible_topics.len() - 1 { + self.state.select(Some(0)); + } else { + self.state.select(Some(i + 1)); + } + } + None => self.state.select(Some(0)), + }; + } + + fn previous(&mut self) { + if self.visible_topics.is_empty() { + return; + } + match self.state.selected() { + Some(i) => { + if i == 0 { + self.state.select(Some(self.visible_topics.len() - 1)); + } else { + self.state.select(Some(i - 1)); + } + } + None => self.state.select(Some(0)), + }; + } + + fn filter_topics(&mut self) { + self.visible_topics = match self.input.value().trim().is_empty() { + true => self.topics.clone(), + false => self + .topics + .clone() + .into_iter() + .filter(|t| t.contains(self.input.value())) + .collect_vec(), + }; + if self.visible_topics.is_empty() { + self.state.select(Some(0)); + } + } +} + +impl Component for TopicsComponent { + fn register_action_handler(&mut self, tx: UnboundedSender) -> Result<(), TuiError> { + self.action_tx = Some(tx); + Ok(()) + } + + fn id(&self) -> ComponentName { + ComponentName::Topics + } + + fn handle_key_events(&mut self, key: KeyEvent) -> Result, TuiError> { + match key.code { + KeyCode::Char('p') if key.modifiers.contains(KeyModifiers::CONTROL) => { + if let Some(selected) = self.state.selected() { + self.action_tx + .as_ref() + .unwrap() + .send(Action::NewView(ComponentName::TopicDetails))?; + + let mut h = HashSet::default(); + h.insert(self.visible_topics.get(selected).unwrap().clone()); + self.action_tx + .as_ref() + .unwrap() + .send(Action::RequestTopicDetails(h))?; + } + } + KeyCode::Char('u') if key.modifiers.contains(KeyModifiers::CONTROL) => { + self.visible_topics = self.topics.clone(); + self.selected.clear(); + self.action_tx + .clone() + .unwrap() + .send(Action::RefreshShortcuts)?; + + self.filter_topics(); + self.action_tx + .clone() + .unwrap() + .send(Action::SelectedTopics(vec![]))?; + } + KeyCode::Up => self.previous(), + KeyCode::Down => self.next(), + KeyCode::Enter => { + if self.state.selected().is_none() { + return Ok(None); + } + let topic = self.visible_topics.get(self.state.selected().unwrap()); + if topic.is_none() { + return Ok(None); + } + let topic = topic.unwrap(); + if 
self.selected.contains(topic) { + self.selected.remove(topic); + } else { + self.selected.insert(topic.to_string()); + } + self.action_tx + .clone() + .unwrap() + .send(Action::SelectedTopics( + self.selected.clone().into_iter().collect_vec(), + ))?; + self.action_tx + .clone() + .unwrap() + .send(Action::RefreshShortcuts)?; + } + KeyCode::Esc => (), + _ => { + if !key.modifiers.contains(KeyModifiers::CONTROL) { + self.input.handle_event(&Event::Key(key)); + self.filter_topics(); + } + } + }; + Ok(None) + } + + fn update(&mut self, action: Action) -> Result, TuiError> { + if let Action::Topics(mut new_topics) = action { + self.loading = false; + new_topics.dedup(); + new_topics.sort(); + self.topics = new_topics; + if !self.topics.is_empty() { + let selected = self.state.selected().unwrap_or(0); + match selected < self.topics.len() { + true => self.state.select(Some(selected)), + false => self.state.select(Some(0)), + } + } + self.visible_topics.clone_from(&self.topics); + + for topic in self.selected.clone() { + if !self.topics.contains(&topic) { + self.selected.remove(&topic); + } + } + }; + Ok(None) + } + + fn draw(&mut self, f: &mut Frame<'_>, rect: Rect, state: &State) -> Result<(), TuiError> { + let is_focused = state.is_focused(self.id()); + let title = match self.selected.len() { + 0 => " Topics ".to_string(), + _ => format!(" Topics [{}] ", self.selected.len()), + }; + let outer_block = Block::default() + .padding(Padding::horizontal(1)) + .borders(Borders::ALL) + .border_type(BorderType::Rounded) + .title(title); + let outer_block = self.make_block_focused_with_state(state, outer_block); + + let items: Vec = self + .visible_topics + .iter() + .map(|i| { + let s = match self.selected.contains(i) { + true => format!("[x] {}", i), + false => format!("[ ] {}", i), + }; + ListItem::new(s).style(Style::default()) + }) + .collect(); + + let list = List::new(items).highlight_style(match is_focused { + true => Style::default() + .bg(state.theme.bg_focused_selected) + .fg(state.theme.fg_focused_selected) + .bold(), + false => Style::default() + .bg(state.theme.bg_unfocused_selected) + .fg(state.theme.fg_unfocused_selected), + }); + + let mut filter_block = Block::default() + .title(" Search ") + .padding(Padding::left(1)) + .borders(Borders::ALL) + .border_type(BorderType::Rounded) + .border_style(Style::default()); + + if is_focused { + filter_block = filter_block + .border_type(BorderType::Thick) + .title_style(Style::default().bold()) + .border_style(Style::default().fg(state.theme.dialog_border)); + } + + let filter = Paragraph::new(self.input.value()) + .style(Style::default()) + .block(filter_block); + + let inner = outer_block.inner(rect); + f.render_widget(outer_block, rect); + + match self.input.value().is_empty() { + true => f.render_stateful_widget(list, inner, &mut self.state), + false => { + let filter = filter.style(Style::default()); + let layout = Layout::default() + .direction(Direction::Vertical) + .constraints([Constraint::Percentage(100), Constraint::Min(3)]) + .split(inner); + if is_focused { + f.set_cursor_position(Position { + x: layout[1].x + (self.input.visual_cursor()) as u16 + 2, + y: layout[1].y + 1, + }); + } + f.render_stateful_widget(list, layout[0], &mut self.state); + f.render_widget(Clear, layout[1]); + f.render_widget(filter, layout[1]); + } + } + + if self.loading { + let loading = Paragraph::new("[/] Loading topics...").style(Style::default()); + f.render_widget(loading, inner); + } + + Ok(()) + } + + fn shortcuts(&self) -> Vec { + let mut shortcuts = 
vec![ + Shortcut::new("ENTER", "Consume topic"), + Shortcut::new("CTRL + P", "Show details"), + ]; + + if !self.selected.is_empty() { + shortcuts.push(Shortcut::new("CTRL + U", "Unselect topics")); + } + shortcuts + } +} diff --git a/crates/tui/src/component/ui.rs b/crates/tui/src/component/ui.rs new file mode 100644 index 0000000..b0e5acd --- /dev/null +++ b/crates/tui/src/component/ui.rs @@ -0,0 +1,365 @@ +//! Module gathering the code to run the terminal user interface. + +use app::search::{Search, SearchContext}; +use app::App; +use crossterm::event::KeyEvent; +use futures::{StreamExt, TryStreamExt}; +use itertools::Itertools; +use lib::KafkaRecord; +use log::{error, info, warn}; +use ratatui::prelude::Rect; +use rdkafka::consumer::{Consumer, StreamConsumer}; +use rdkafka::message::OwnedMessage; +use rdkafka::Message; +use std::collections::HashSet; +use std::fs; +use std::time::Duration; +use tokio::sync::mpsc::{self, UnboundedSender}; +use tokio::time::Instant; +use tokio::{select, time}; +use tokio_util::sync::CancellationToken; + +use crate::action::{Action, Notification}; +use crate::component::{Component, RootComponent}; +use crate::error::TuiError; +use crate::tui; + +use super::{ConcurrentRecordsBuffer, State, BUFFER}; + +pub struct Ui { + pub app: App, + pub should_quit: bool, + + pub root: RootComponent, + pub worker: CancellationToken, + pub topics: Vec, + pub last_tick_key_events: Vec, + pub last_time_consuming: Instant, + pub records_sender: Option>, + pub records: &'static ConcurrentRecordsBuffer, +} + +impl Ui { + pub async fn new( + app: App, + query: String, + selected_topics: Vec, + state: State, + ) -> Result { + Ok(Self { + should_quit: false, + worker: CancellationToken::new(), + app: app.clone(), + records: &BUFFER, + topics: vec![], + root: RootComponent::new(query, selected_topics, &app.config, &BUFFER, state), + records_sender: None, + last_tick_key_events: Vec::new(), + last_time_consuming: Instant::now(), + }) + } + + pub fn save_config(&self) -> Result<(), TuiError> { + let mut config = self.app.config.clone(); + if config.history.len() > 1000 { + config.history = config.history.into_iter().skip(500).collect(); + } + fs::write( + &self.app.config.path, + serde_json::to_string_pretty(&self.app.config)?, + )?; + Ok(()) + } + + pub async fn create_consumer( + app: &App, + topics: Vec, + tx: UnboundedSender, + ) -> Result { + match app.create_consumer(&topics) { + Ok(c) => Ok(c), + Err(e) => { + tx.send(Action::Notification(Notification::new( + log::Level::Error, + e.to_string(), + )))?; + error!("Something went wrong when trying to consume topics: {}", e); + Err(e.into()) + } + } + } + + pub async fn consume_topics(&mut self, tx: UnboundedSender) -> Result<(), TuiError> { + self.worker.cancel(); + self.records.lock().unwrap().reset(); + if self.topics.is_empty() { + tx.send(Action::StopConsuming())?; + return Ok(()); + } + self.worker = CancellationToken::new(); + + let query = self.app.search_query.query().clone(); + let order_by = query.order_by.clone(); + tx.send(Action::OrderBy(order_by.clone()))?; + tx.send(Action::NewConsumer())?; + tx.send(Action::Consuming)?; + let r = self.records; + let token = self.worker.clone(); + tokio::spawn(async move { + while !token.is_cancelled() { + r.lock().unwrap().sort(&order_by); + let mut interval = time::interval(Duration::from_secs(1)); + interval.tick().await; + } + }); + let r = self.records; + let token = self.worker.clone(); + let search_query = self.app.search_query.query().clone(); + let app = self.app.clone(); 
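+        // The consumer task below forwards the raw `OwnedMessage`s through the `tx_dd` channel to a
+        // dedicated task that parses each one into a `KafkaRecord`, applies the search query, and
+        // pushes the matching records into the shared records buffer.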
+ let txx = tx.clone(); + let topics = self.topics.clone(); + + let (tx_dd, mut rx_dd) = mpsc::unbounded_channel::(); + let mut schema_registry = app.schema_registry().clone(); + let token_cloned = token.clone(); + tokio::spawn(async move { + loop { + select! { + _ = token_cloned.cancelled() => { + info!("Consumer is about to be cancelled"); + return; + }, + Some(message) = rx_dd.recv() => { + let record = KafkaRecord::parse(message, &mut schema_registry).await; + let context = SearchContext::new(&record); + let mut ll = r.lock().unwrap(); + ll.new_record_read(); + if search_query.matches(&context) { + ll.push(record); + } + ll.dispatch_metrics(); + if let Some(limit) = query.limit { + if Some(ll.matched_and_read().0) >= Some(limit) { + token_cloned.cancel(); + } + } + } + } + } + }); + + tokio::spawn(async move { + let _ = tx.send(Action::Consuming); + let consumer = match Self::create_consumer(&app, topics.clone(), txx.clone()).await { + Ok(c) => c, + Err(e) => { + let _ = tx.send(Action::StopConsuming()); + warn!("I was not able to create a consumer: {}", e); + return Err("I was not able to create a consumer after 5 attempts..."); + } + }; + let _ = tx.send(Action::Consuming); + let assignments = consumer.assignment().unwrap(); + let txx = tx.clone(); + tokio::spawn(async move { + let count = app + .estimate_number_of_records_to_read(assignments) + .unwrap_or(0); + let _ = txx.send(Action::RecordsToRead(count as usize)); + }); + let mut current_time = Instant::now(); + let _ = consumer + .stream() + .take_until(token.cancelled()) + .try_for_each(|message| { + let message = message.detach(); + let timestamp = message.timestamp().to_millis().unwrap_or_default(); + tx_dd.send(message).unwrap(); + if current_time.elapsed() > Duration::from_secs(20) { + current_time = Instant::now(); + tx.send(Action::Notification(Notification::new( + log::Level::Info, + format!("Checkpoint: {}", timestamp), + ))) + .unwrap(); + } + futures::future::ok(()) + }) + .await; + consumer.unassign().unwrap(); + info!("Consumer is terminated"); + token.cancel(); + r.lock().unwrap().sort(&query.order_by); + let _ = tx.send(Action::StopConsuming()); + Ok(()) + }); + Ok(()) + } + + pub fn topics_details( + &mut self, + topics: HashSet, + action_tx: UnboundedSender, + ) -> Result<(), TuiError> { + let app = self.app.clone(); + tokio::spawn(async move { + info!("Loading topics"); + match app.topic_details(topics) { + Ok(details) => action_tx.send(Action::TopicDetails(details)).unwrap(), + Err(e) => action_tx + .send(Action::Notification(Notification::new( + log::Level::Error, + e.to_string(), + ))) + .unwrap(), + } + }); + Ok(()) + } + + pub fn export_record( + &mut self, + record: &KafkaRecord, + action_tx: UnboundedSender, + ) -> Result<(), TuiError> { + self.app.export_record(record)?; + action_tx.send(Action::Notification(Notification::new( + log::Level::Info, + "Record exported to the file".to_string(), + )))?; + Ok(()) + } + + pub fn load_topics(&mut self, action_tx: UnboundedSender) -> Result<(), TuiError> { + let app = self.app.clone(); + tokio::spawn(async move { + info!("Loading topics"); + match app.list_topics() { + Ok(topics) => { + action_tx.send(Action::Topics(topics)).unwrap(); + } + Err(e) => { + action_tx + .send(Action::Notification(Notification::new( + log::Level::Error, + e.to_string(), + ))) + .unwrap(); + error!("Something went wrong when trying to list topics: {}", e) + } + } + }); + Ok(()) + } + + pub async fn run(&mut self, topics: Vec, state: State) -> Result<(), TuiError> { + let (action_tx, 
mut action_rx) = mpsc::unbounded_channel(); + let records_channel = mpsc::unbounded_channel::(); + self.records_sender = Some(records_channel.0); + self.load_topics(action_tx.clone())?; + let mut tui = tui::Tui::new()?; + tui.enter()?; + self.records + .lock() + .unwrap() + .register_action_handler(action_tx.clone()); + self.root.register_action_handler(action_tx.clone())?; + self.root.init()?; + action_tx.send(Action::SelectedTopics(topics))?; + loop { + if let Some(e) = tui.next().await { + match e { + tui::Event::Quit => action_tx.send(Action::Quit)?, + tui::Event::Tick => action_tx.send(Action::Tick)?, + tui::Event::Render => action_tx.send(Action::Render)?, + tui::Event::Resize(x, y) => action_tx.send(Action::Resize(x, y))?, + _ => {} + }; + + if let Some(action) = self.root.handle_events(Some(e.clone()))? { + action_tx.send(action)?; + } + } + while let Ok(action) = action_rx.try_recv() { + match action { + Action::NewConfig(ref config) => { + self.app.config = config.clone(); + self.save_config()?; + } + Action::NewSearchPrompt(ref prompt) => { + self.app.config.history.push(prompt.to_string()); + self.app.config.history.dedup(); + self.save_config()?; + } + Action::RequestTopicDetails(ref topics) => { + self.topics_details(topics.clone(), action_tx.clone())?; + } + Action::Tick => { + self.last_tick_key_events.drain(..); + } + Action::Refresh => { + self.load_topics(action_tx.clone())?; + action_tx.send(Action::Notification(Notification::new( + log::Level::Info, + "Refreshing topics".to_string(), + )))?; + } + Action::Quit => { + self.worker.cancel(); + self.should_quit = true; + } + Action::Open(ref record) => { + let url = self + .app + .config + .url_template_of(&state.cluster) + .replace("{topic}", &record.topic) + .replace("{partition}", &record.partition.to_string()) + .replace("{offset}", &record.offset.to_string()); + + if let Err(e) = open::that(&url) { + action_tx.send(Action::Notification(Notification::new( + log::Level::Info, + "this action is not available right now".to_string(), + )))?; + warn!("Cannot open the URL '{}': {}", url, e) + } + } + Action::Resize(w, h) => { + tui.resize(Rect::new(0, 0, w, h))?; + tui.draw(|f| { + let _ = self.root.draw(f, f.area(), &state); + })?; + } + Action::Export(ref record) => { + self.export_record(record, action_tx.clone())?; + } + Action::Render => { + tui.draw(|f| { + let _ = self.root.draw(f, f.area(), &state); + })?; + } + Action::SelectedTopics(ref topics) => { + self.topics = topics.iter().map(|t| t.into()).collect_vec(); + self.consume_topics(action_tx.clone()).await?; + } + Action::Search(ref search) => { + self.app.search_query = search.clone(); + self.consume_topics(action_tx.clone()).await?; + } + _ => {} + } + + if let Some(action) = self.root.update(action.clone())? { + action_tx.send(action.clone())? 
+ }; + } + if self.should_quit { + tui.stop()?; + break; + } + } + tui.exit()?; + Ok(()) + } +} diff --git a/crates/tui/src/component/vertical_scrollable_block.rs b/crates/tui/src/component/vertical_scrollable_block.rs new file mode 100644 index 0000000..2dde1f6 --- /dev/null +++ b/crates/tui/src/component/vertical_scrollable_block.rs @@ -0,0 +1,133 @@ +use super::{Component, Shortcut, WithHeight}; +use crossterm::event::{KeyCode, KeyEvent, MouseEvent}; +use ratatui::{ + layout::{Margin, Rect}, + widgets::{Block, Scrollbar, ScrollbarOrientation, ScrollbarState}, + Frame, +}; +use tokio::sync::mpsc::UnboundedSender; + +use crate::{action::Action, error::TuiError, tui::Event}; + +use super::{ComponentName, State}; + +#[derive(Default)] +pub struct VerticalScrollableBlock { + scroll: u16, + scroll_length: u16, + scrollbar_state: ScrollbarState, + component: C, +} + +impl VerticalScrollableBlock +where + C: WithHeight, +{ + #[allow(dead_code)] + pub fn new(component: C) -> Self { + Self { + scroll: 0, + scroll_length: 10, + scrollbar_state: ScrollbarState::new(component.content_height()), + component, + } + } +} + +impl Component for VerticalScrollableBlock +where + C: WithHeight, +{ + #[allow(unused_variables)] + fn register_action_handler(&mut self, tx: UnboundedSender) -> Result<(), TuiError> { + self.component.register_action_handler(tx) + } + + fn id(&self) -> ComponentName { + self.component.id() + } + + fn make_block_focused_with_state<'a>(&self, state: &State, block: Block<'a>) -> Block<'a> { + self.component.make_block_focused_with_state(state, block) + } + + fn make_block_focused<'a>(&self, state: &State, block: Block<'a>) -> Block<'a> { + self.component.make_block_focused(state, block) + } + + fn init(&mut self) -> Result<(), TuiError> { + self.component.init() + } + + fn handle_events(&mut self, event: Option) -> Result, TuiError> { + let r = match event { + Some(Event::Key(key_event)) => self.handle_key_events(key_event)?, + Some(Event::Mouse(mouse_event)) => self.handle_mouse_events(mouse_event)?, + _ => None, + }; + Ok(r) + } + + fn handle_key_events(&mut self, key: KeyEvent) -> Result, TuiError> { + match key.code { + KeyCode::Char('k') | KeyCode::Down => { + self.scroll = (self.scroll + 1).min(self.scroll_length); + } + KeyCode::Char('j') | KeyCode::Up => { + self.scroll = self.scroll.saturating_sub(1); + } + KeyCode::Char('[') => { + self.scroll = 0; + } + KeyCode::Char(']') => { + self.scroll = self.scroll_length; + } + _ => { + self.component.handle_key_events(key)?; + } + } + Ok(None) + } + + fn handle_mouse_events(&mut self, mouse: MouseEvent) -> Result, TuiError> { + self.component.handle_mouse_events(mouse) + } + + fn update(&mut self, action: Action) -> Result, TuiError> { + self.component.update(action) + } + + fn shortcuts(&self) -> Vec { + self.component.shortcuts() + } + + fn draw(&mut self, f: &mut Frame<'_>, rect: Rect, state: &State) -> Result<(), TuiError> { + self.scrollbar_state = self.scrollbar_state.content_length(0); + let content_height = self.component.content_height() as u16; + if rect.height < content_height { + self.scroll_length = content_height - rect.height + 2; + self.scrollbar_state = self + .scrollbar_state + .content_length(self.scroll_length as usize) + .position(self.scroll as usize); + } else { + self.scrollbar_state = self.scrollbar_state.content_length(0); + self.scroll = 0; + } + + let scrollbar = Scrollbar::new(ScrollbarOrientation::VerticalRight) + .begin_symbol(Some("▲")) + .end_symbol(Some("▼")); + + self.component.draw(f, rect, 
state)?; + f.render_stateful_widget( + scrollbar, + rect.inner(Margin { + vertical: 1, + horizontal: 0, + }), + &mut self.scrollbar_state, + ); + Ok(()) + } +} diff --git a/crates/tui/src/error.rs b/crates/tui/src/error.rs new file mode 100644 index 0000000..508223c --- /dev/null +++ b/crates/tui/src/error.rs @@ -0,0 +1,66 @@ +use lib::Error; +use std::{ + fmt::{self, Display, Formatter}, + num::TryFromIntError, +}; +use tokio::sync::mpsc::error::SendError; + +/// Wrapper of around the `lib::Error` struct. +//#[derive(Debug)] +pub struct TuiError(Error); + +impl From> for TuiError { + fn from(e: SendError) -> Self { + TuiError(Error::Error(e.to_string())) + } +} + +impl Display for TuiError { + fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result { + write!(f, "{}", self.0) + } +} + +use std::fmt::Debug; + +impl Debug for TuiError { + fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result { + write!(f, "{}", self.0) + } +} + +impl From<&str> for TuiError { + fn from(e: &str) -> Self { + TuiError(Error::Error(e.to_string())) + } +} + +impl From for TuiError { + fn from(e: serde_json::Error) -> Self { + TuiError(Error::SerdeError(e)) + } +} + +impl From for TuiError { + fn from(e: Error) -> Self { + TuiError(e) + } +} + +impl From for Error { + fn from(val: TuiError) -> Self { + val.0 + } +} + +impl From for TuiError { + fn from(e: std::io::Error) -> Self { + TuiError(Error::IoError(e)) + } +} + +impl From for TuiError { + fn from(e: TryFromIntError) -> Self { + TuiError(Error::Error(e.to_string())) + } +} diff --git a/crates/tui/src/lib.rs b/crates/tui/src/lib.rs new file mode 100644 index 0000000..3b08736 --- /dev/null +++ b/crates/tui/src/lib.rs @@ -0,0 +1,15 @@ +//! This library contains all the glue code with [Ratatui](https://github.com/ratatui/ratatui). + +mod action; +mod component; +pub mod error; +mod records_buffer; +pub mod theme; +mod tui; +pub use action::Action; +pub use action::Notification; + +pub use component::State; +pub use component::Ui; +pub use error::TuiError; +pub use theme::Theme; diff --git a/crates/tui/src/records_buffer.rs b/crates/tui/src/records_buffer.rs new file mode 100644 index 0000000..b0d0557 --- /dev/null +++ b/crates/tui/src/records_buffer.rs @@ -0,0 +1,156 @@ +//! It uses a ring buffer to store the kafka records. +//! At the time writing, the tool stores the `[BUFFER_SIZE]` last records. +//! +//! This should be possible to increase the size but the more you display events, +//! the more the tool gets laggy. I need to work on it. + +use circular_buffer::{CircularBuffer, Iter}; +use lib::{ + search::{Order, OrderBy}, + KafkaRecord, +}; +use rayon::prelude::*; +use tokio::sync::{ + mpsc::UnboundedSender, + watch::{self, Receiver, Sender}, +}; + +use crate::action::Action; + +/// Size of the ring buffer +const BUFFER_SIZE: usize = 500; + +/// Wrapper around [CircularBuffer] +pub struct RecordsBuffer { + buffer: CircularBuffer, + tx_action: Option>, + read: usize, + pub channels: (Sender, Receiver), + last_time_sorted: usize, + matched: usize, +} + +macro_rules! 
sort_records { + ($array:ident, $field: ident, $reverse: expr) => { + $array.par_sort_by(|a, b| { + let mut ordering = a.$field.cmp(&b.$field); + if $reverse { + ordering = ordering.reverse(); + } + ordering + }) + }; +} + +impl Default for RecordsBuffer { + fn default() -> Self { + Self::new() + } +} + +impl RecordsBuffer { + pub fn new() -> Self { + Self { + buffer: CircularBuffer::::new(), + read: 0, + channels: watch::channel(BufferAction::Count((0, 0, 0))), + matched: 0, + last_time_sorted: 0, + tx_action: None, + } + } + + pub fn register_action_handler(&mut self, tx: UnboundedSender) { + self.tx_action = Some(tx); + } + + pub fn is_empty(&self) -> bool { + self.buffer.is_empty() + } + + /// Empty the buffer and reset metrics + pub fn reset(&mut self) { + self.buffer.clear(); + self.read = 0; + self.matched = 0; + self.dispatch_metrics(); + } + + /// Returns the metrics of the number of records matched and read. + pub fn matched_and_read(&self) -> (usize, usize, usize) { + (self.matched, self.read, self.buffer.len()) + } + + /// Updates the metric regarding the number of kafka records read + pub fn new_record_read(&mut self) { + self.read += 1; + } + + pub fn len(&self) -> usize { + self.buffer.len() + } + + pub fn get(&self, index: usize) -> Option<&KafkaRecord> { + self.buffer.get(index) + } + + pub fn iter(&self) -> Iter { + self.buffer.iter() + } + + pub fn push(&mut self, kafka_record: KafkaRecord) -> usize { + self.buffer.push_back(kafka_record); + self.matched += 1; + self.matched + } + + /// Dispatches a new events about the metrics of the buffer + pub fn dispatch_metrics(&mut self) { + self.channels + .0 + .send(BufferAction::Count(self.matched_and_read())) + .unwrap(); + } + + /// Sort the buffer by the given order + pub fn sort(&mut self, order_by: &OrderBy) { + let mut unsorted = self.buffer.to_vec(); + if self.read == self.last_time_sorted { + return; + } + let reverse = order_by.is_descending(); + match order_by.order { + Order::Timestamp => { + sort_records!(unsorted, timestamp, reverse) + } + Order::Key => { + sort_records!(unsorted, key_as_string, reverse) + } + Order::Value => sort_records!(unsorted, value_as_string, reverse), + Order::Partition => { + sort_records!(unsorted, partition, reverse) + } + Order::Offset => { + sort_records!(unsorted, offset, reverse) + } + Order::Size => unsorted.sort_by(|a, b| { + let mut ordering = a.size.cmp(&b.size); + if order_by.is_descending() { + ordering = ordering.reverse(); + } + ordering + }), + Order::Topic => { + sort_records!(unsorted, topic, reverse) + } + } + self.buffer.clear(); + self.buffer.extend(unsorted) + } +} + +#[allow(clippy::large_enum_variant)] +#[derive(Debug, Clone, PartialEq, Eq)] +pub enum BufferAction { + Count((usize, usize, usize)), +} diff --git a/crates/tui/src/theme.rs b/crates/tui/src/theme.rs new file mode 100644 index 0000000..9aba913 --- /dev/null +++ b/crates/tui/src/theme.rs @@ -0,0 +1,63 @@ +//! Theme for the TUI + +use ratatui::style::Color; +use serde::{Deserialize, Serialize}; + +/// A `Theme` contains all the colors +/// to make the TUI pretty. 
+#[derive(Clone, Debug, PartialEq, Eq, Hash, Serialize, Deserialize)] +pub struct Theme { + pub name: String, + pub fg: Color, + pub bg: Color, + pub black: Color, + pub red: Color, + pub green: Color, + pub yellow: Color, + pub blue: Color, + pub magenta: Color, + pub cyan: Color, + pub white: Color, + pub orange: Color, + pub focused_border: Color, + pub bg_focused_selected: Color, + pub fg_focused_selected: Color, + pub bg_unfocused_selected: Color, + pub fg_unfocused_selected: Color, + pub bg_disabled: Color, + pub fg_disabled: Color, + pub bg_active: Color, + pub fg_active: Color, + pub dialog_border: Color, + pub autocomplete: Color, +} + +impl Theme { + pub fn light() -> Self { + Self { + name: "light".to_string(), + fg: Color::Black, + bg: Color::White, + black: Color::Black, + red: Color::Red, + green: Color::Green, + yellow: Color::Yellow, + blue: Color::Blue, + magenta: Color::Magenta, + cyan: Color::Cyan, + white: Color::White, + orange: Color::LightRed, + focused_border: Color::Blue, + bg_focused_selected: Color::Black, + fg_focused_selected: Color::White, + bg_unfocused_selected: Color::White, + fg_unfocused_selected: Color::default(), + dialog_border: Color::Yellow, + autocomplete: Color::Rgb(100, 100, 100), + bg_disabled: Color::Indexed(7), + fg_disabled: Color::Black, + bg_active: Color::Green, + fg_active: Color::Black, + } + } +} diff --git a/crates/tui/src/tui.rs b/crates/tui/src/tui.rs new file mode 100644 index 0000000..8192e59 --- /dev/null +++ b/crates/tui/src/tui.rs @@ -0,0 +1,213 @@ +use std::{ + ops::{Deref, DerefMut}, + panic::{set_hook, take_hook}, +}; + +use crossterm::{ + cursor, + event::{Event as CrosstermEvent, KeyEvent, MouseEvent}, + terminal::{EnterAlternateScreen, LeaveAlternateScreen}, +}; +use futures::{FutureExt, StreamExt}; +use lib::Error; +use ratatui::backend::CrosstermBackend as Backend; +use tokio::{ + sync::mpsc::{self, UnboundedReceiver, UnboundedSender}, + task::JoinHandle, +}; +use tokio_util::sync::CancellationToken; + +#[derive(Clone, Debug)] +pub enum Event { + Init, + #[allow(dead_code)] + Quit, + Error, + Tick, + Render, + FocusGained, + FocusLost, + Paste, + Key(KeyEvent), + Mouse(MouseEvent), + Resize(u16, u16), +} + +pub struct Tui { + pub terminal: ratatui::Terminal>, + pub task: JoinHandle<()>, + pub cancellation_token: CancellationToken, + pub event_rx: UnboundedReceiver, + pub event_tx: UnboundedSender, + pub frame_rate: f64, + pub tick_rate: f64, +} + +impl Tui { + pub fn new() -> Result { + let tick_rate = 4.0; + let frame_rate = 30.0; + let terminal = ratatui::Terminal::new(Backend::new(std::io::stderr()))?; + let (event_tx, event_rx) = mpsc::unbounded_channel(); + let cancellation_token = CancellationToken::new(); + let task = tokio::spawn(async {}); + Ok(Self { + terminal, + task, + cancellation_token, + event_rx, + event_tx, + frame_rate, + tick_rate, + }) + } + + pub fn start(&mut self) { + let tick_delay = std::time::Duration::from_secs_f64(1.0 / self.tick_rate); + let render_delay = std::time::Duration::from_secs_f64(1.0 / self.frame_rate); + self.cancel(); + self.cancellation_token = CancellationToken::new(); + let _cancellation_token = self.cancellation_token.clone(); + let event_tx = self.event_tx.clone(); + self.task = tokio::spawn(async move { + let mut reader = crossterm::event::EventStream::new(); + let mut tick_interval = tokio::time::interval(tick_delay); + let mut render_interval = tokio::time::interval(render_delay); + event_tx.send(Event::Init).unwrap(); + loop { + let tick_delay = tick_interval.tick(); + 
let render_delay = render_interval.tick(); + let crossterm_event = reader.next().fuse(); + tokio::select! { + _ = _cancellation_token.cancelled() => { + break; + } + maybe_event = crossterm_event => { + + match maybe_event { + Some(Ok(evt)) => { + match evt { + CrosstermEvent::Key(key) => { + event_tx.send(Event::Key(key)).unwrap(); + }, + CrosstermEvent::Mouse(mouse) => { + event_tx.send(Event::Mouse(mouse)).unwrap(); + }, + CrosstermEvent::Resize(x, y) => { + event_tx.send(Event::Resize(x, y)).unwrap(); + }, + CrosstermEvent::FocusLost => { + event_tx.send(Event::FocusLost).unwrap(); + }, + CrosstermEvent::FocusGained => { + event_tx.send(Event::FocusGained).unwrap(); + }, + CrosstermEvent::Paste(_) => { + event_tx.send(Event::Paste).unwrap(); + } + } + } + Some(Err(_)) => { + event_tx.send(Event::Error).unwrap(); + } + None => {}, + } + }, + _ = tick_delay => { + event_tx.send(Event::Tick).unwrap(); + }, + _ = render_delay => { + event_tx.send(Event::Render).unwrap(); + }, + } + } + }); + } + + pub fn stop(&self) -> Result<(), Error> { + self.cancel(); + Ok(()) + } + + pub fn init_panic_hook(&self) { + let original_hook = take_hook(); + set_hook(Box::new(move |panic_info| { + Self::restore_tui().unwrap(); + original_hook(panic_info); + })); + } + + pub fn enter(&mut self) -> Result<(), Error> { + self.init_panic_hook(); + crossterm::terminal::enable_raw_mode()?; + crossterm::execute!( + std::io::stderr(), + EnterAlternateScreen, + //EnableMouseCapture, + cursor::Hide, + //PushKeyboardEnhancementFlags( + // KeyboardEnhancementFlags::DISAMBIGUATE_ESCAPE_CODES + // | KeyboardEnhancementFlags::REPORT_ALL_KEYS_AS_ESCAPE_CODES + //) + )?; + self.start(); + Ok(()) + } + + pub fn exit(&mut self) -> Result<(), Error> { + self.stop()?; + if crossterm::terminal::is_raw_mode_enabled()? { + self.flush()?; + crossterm::execute!( + std::io::stderr(), + LeaveAlternateScreen, + // DisableMouseCapture, + cursor::Show, + //PopKeyboardEnhancementFlags + )?; + crossterm::terminal::disable_raw_mode()?; + } + Ok(()) + } + + pub fn restore_tui() -> Result<(), Error> { + if crossterm::terminal::is_raw_mode_enabled()? { + crossterm::execute!( + std::io::stderr(), + LeaveAlternateScreen, + //DisableMouseCapture, + cursor::Show + )?; + crossterm::terminal::disable_raw_mode()?; + } + Ok(()) + } + + pub fn cancel(&self) { + self.cancellation_token.cancel(); + } + + pub async fn next(&mut self) -> Option { + self.event_rx.recv().await + } +} + +impl Deref for Tui { + type Target = ratatui::Terminal>; + + fn deref(&self) -> &Self::Target { + &self.terminal + } +} + +impl DerefMut for Tui { + fn deref_mut(&mut self) -> &mut Self::Target { + &mut self.terminal + } +} + +impl Drop for Tui { + fn drop(&mut self) { + self.exit().unwrap(); + } +} diff --git a/crates/wasm-blueprints/README.md b/crates/wasm-blueprints/README.md new file mode 100644 index 0000000..d276aa4 --- /dev/null +++ b/crates/wasm-blueprints/README.md @@ -0,0 +1,20 @@ +
+  logo of WebAssembly
+  Creating a search filter.
+ +Refer to [the documentation](../../docs/search-filter/README.md) for more details. 2 blueprints are available to implement your search filter: + - Rust + - Golang + +Your favorite programming language is not listed above? Feel free to contribute with another blueprint. Take a look at [Extism](https://extism.org/) to see if it supports your language. + + +## Creating a new blueprint + +For a good develop experience, please respect the following rules: + 1. The blueprint must implement the `key-ends-with` search filter as an example. + 1. A `Makefile` must be present. Feel free to copy/paste one of [the existing ones](./rust/Makefile) and adapt it. + 1. Running `make build` must create a wasm file named `module.wasm`. + 1. A `README.md` must be present. You can also take inspiration from [the existing ones](./rust/README.md). + 1. The `Makefile` must include a `test` recipes running some basic tests. Feel free to copy/paste one of [the existing ones](./rust/Makefile) and adapt it. diff --git a/crates/wasm-blueprints/golang/.dockerignore b/crates/wasm-blueprints/golang/.dockerignore new file mode 100644 index 0000000..c25e012 --- /dev/null +++ b/crates/wasm-blueprints/golang/.dockerignore @@ -0,0 +1,6 @@ +Dockerfile +target +README.md +.git +.gitignore +module.wasm \ No newline at end of file diff --git a/crates/wasm-blueprints/golang/.gitignore b/crates/wasm-blueprints/golang/.gitignore new file mode 100644 index 0000000..917660a --- /dev/null +++ b/crates/wasm-blueprints/golang/.gitignore @@ -0,0 +1 @@ +*.wasm \ No newline at end of file diff --git a/crates/wasm-blueprints/golang/Dockerfile b/crates/wasm-blueprints/golang/Dockerfile new file mode 100644 index 0000000..e7b73fe --- /dev/null +++ b/crates/wasm-blueprints/golang/Dockerfile @@ -0,0 +1,8 @@ +FROM tinygo/tinygo:latest + +USER root +RUN apt update && apt install make +USER tinygo +WORKDIR /tmp/build +COPY --chown=tinygo . . +RUN make module.wasm \ No newline at end of file diff --git a/crates/wasm-blueprints/golang/Makefile b/crates/wasm-blueprints/golang/Makefile new file mode 100644 index 0000000..64bb45f --- /dev/null +++ b/crates/wasm-blueprints/golang/Makefile @@ -0,0 +1,32 @@ +.DEFAULT_GOAL = help +TARGET = module.wasm +.PHONY: help $(TARGET) +DOCKER_IMAGE = yozefu-wasm-blueprints-golang + +build: ## Try to build the wasm module. If it fails, it tries to build with Docker + make -S $(TARGET) || make -S build-from-docker + +build-from-docker: $(TARGET) ## Build the wasm module with Docker + @echo " 🐋 Trying to build wasm module with docker" + docker build -t $(DOCKER_IMAGE) . + docker create --name $(DOCKER_IMAGE) $(DOCKER_IMAGE) + docker cp $(DOCKER_IMAGE):/tmp/build/$(TARGET) . 
+ docker rm -v $(DOCKER_IMAGE) + +$(TARGET): ## Build the wasm module + go mod tidy + tinygo build -o $@ -no-debug -opt=2 -target wasi main.go + +test: $(TARGET) ## Run the tests + cat "tests/parameters.json" | extism call --wasi --stdin $(TARGET) parse_parameters + cat "tests/match.json" | extism call --wasi --stdin $(TARGET) matches | grep -q '"match":true' + cat "tests/no-match.json" | extism call --wasi --stdin $(TARGET) matches | grep -q '"match":false' + +clean: ## Clean the wasm file and the target directory + rm -f $(TARGET) + +help: ## Show this help + @echo "Variables:" + @make -pnf $(MAKEFILE_LIST) | awk '/^# (makefile |command)/{getline; print}' | grep -v "^MAKEFILE_LIST" | sort | uniq | awk 'BEGIN {FS = ":?= "}; {printf " \033[36m%-30s\033[0m %s\n", $$1, $$2}' + @echo "\nTargets:" + @grep -E '^[/%a-zA-Z0-9_-]+: .*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ": .*?## "}; {printf " \033[36m%-30s\033[0m %s\n", $$1, $$2}' \ No newline at end of file diff --git a/crates/wasm-blueprints/golang/README.md b/crates/wasm-blueprints/golang/README.md new file mode 100644 index 0000000..87695c3 --- /dev/null +++ b/crates/wasm-blueprints/golang/README.md @@ -0,0 +1,11 @@ +# Golang search filter + +Link explaining how to write a search filter +Link to Extism Golang PDK + +Blueprint project to write a search filter in Golang. +To build the WebAssembly module: +```bash +make build +yozf import-filter module.wasm --name 'key-ends-with' +``` \ No newline at end of file diff --git a/crates/wasm-blueprints/golang/go.mod b/crates/wasm-blueprints/golang/go.mod new file mode 100644 index 0000000..81fa7a7 --- /dev/null +++ b/crates/wasm-blueprints/golang/go.mod @@ -0,0 +1,5 @@ +module golang + +go 1.23.3 + +require github.com/extism/go-pdk v1.1.0 diff --git a/crates/wasm-blueprints/golang/main.go b/crates/wasm-blueprints/golang/main.go new file mode 100644 index 0000000..047f2b7 --- /dev/null +++ b/crates/wasm-blueprints/golang/main.go @@ -0,0 +1,72 @@ +// https://extism.org/docs/quickstart/plugin-quickstart + +package main + +import ( + "errors" + "fmt" + "strings" + + "github.com/extism/go-pdk" +) + +type KafkaRecord struct { + Value string `json:"value"` + Key string `json:"key"` + Topic string `json:"topic"` + Timestamp int64 `json:"timestamp"` + Partition int `json:"partition"` + Offset int `json:"offset"` + Headers interface{} `json:"headers"` +} + +type FilterParams = []string + +type FilterInput struct { + Record KafkaRecord `json:"record"` + Params FilterParams `json:"params"` +} + +type FilterResult struct { + Match bool `json:"match"` +} + +//export matches +func Matches() int32 { + // TODO - Edit the code as per your requirements + input := FilterInput{} + err := pdk.InputJSON(&input) + param := input.Params[0] + if err != nil { + pdk.SetError(err) + return 1 + } + if strings.HasSuffix(input.Record.Key, param) { + err = pdk.OutputJSON(FilterResult{Match: true}) + if err != nil { + pdk.SetError(err) + return 1 + } + return 0 + } + _ = pdk.OutputJSON(FilterResult{Match: false}) + return 0 +} + +//export parse_parameters +func ParseParameters() int32 { + // TODO - Edit the code as per your requirements + var params FilterParams + err := pdk.InputJSON(¶ms) + if err != nil { + pdk.SetError(err) + return 1 + } + if len(params) != 1 { + pdk.SetError(errors.New(fmt.Sprintf("This search filter expects a string argument. 
Found %v arguments", len(params)))) + return 1 + } + return 0 +} + +func main() {} diff --git a/crates/wasm-blueprints/golang/tests/match.json b/crates/wasm-blueprints/golang/tests/match.json new file mode 100644 index 0000000..4a57d7c --- /dev/null +++ b/crates/wasm-blueprints/golang/tests/match.json @@ -0,0 +1,14 @@ +{ + "record": { + "value": "", + "key": "21965", + "topic": "public-french-addresses", + "timestamp": 1732479526752, + "partition": 0, + "offset": 4, + "headers": {} + }, + "params": [ + "965" + ] +} \ No newline at end of file diff --git a/crates/wasm-blueprints/golang/tests/no-match.json b/crates/wasm-blueprints/golang/tests/no-match.json new file mode 100644 index 0000000..1b6b8bf --- /dev/null +++ b/crates/wasm-blueprints/golang/tests/no-match.json @@ -0,0 +1,14 @@ +{ + "record": { + "value": "", + "key": "21965", + "topic": "public-french-addresses", + "timestamp": 1732479526752, + "partition": 0, + "offset": 4, + "headers": {} + }, + "params": [ + "easy-cookie" + ] +} \ No newline at end of file diff --git a/crates/wasm-blueprints/golang/tests/parameters.json b/crates/wasm-blueprints/golang/tests/parameters.json new file mode 100644 index 0000000..c501502 --- /dev/null +++ b/crates/wasm-blueprints/golang/tests/parameters.json @@ -0,0 +1 @@ +["a-key-suffix"] \ No newline at end of file diff --git a/crates/wasm-blueprints/rust/.cargo/config.toml b/crates/wasm-blueprints/rust/.cargo/config.toml new file mode 100644 index 0000000..435ed75 --- /dev/null +++ b/crates/wasm-blueprints/rust/.cargo/config.toml @@ -0,0 +1,2 @@ +[build] +target = "wasm32-unknown-unknown" \ No newline at end of file diff --git a/crates/wasm-blueprints/rust/.dockerignore b/crates/wasm-blueprints/rust/.dockerignore new file mode 100644 index 0000000..9fc6324 --- /dev/null +++ b/crates/wasm-blueprints/rust/.dockerignore @@ -0,0 +1,7 @@ +Dockerfile +target +README.md +.git +.gitignore +.dockerignore +module.wasm \ No newline at end of file diff --git a/crates/wasm-blueprints/rust/.gitignore b/crates/wasm-blueprints/rust/.gitignore new file mode 100644 index 0000000..719a931 --- /dev/null +++ b/crates/wasm-blueprints/rust/.gitignore @@ -0,0 +1,3 @@ +/target +.vscode +**/*DS_Store \ No newline at end of file diff --git a/crates/wasm-blueprints/rust/Cargo.toml b/crates/wasm-blueprints/rust/Cargo.toml new file mode 100644 index 0000000..649a4ed --- /dev/null +++ b/crates/wasm-blueprints/rust/Cargo.toml @@ -0,0 +1,15 @@ +[package] +name = "yozefu-wasm-blueprints-rust" +version = "0.1.0" +edition = "2021" + +[dependencies] +extism-pdk = "1.3.0" +yozefu-wasm-types = { git = "ssh://git@github.com/MAIF/yozefu.git", branch="main" } +#yozefu-wasm-types = { path = "../../wasm-types" } + + +[workspace] + +[lib] +crate-type = ["cdylib"] diff --git a/crates/wasm-blueprints/rust/Dockerfile b/crates/wasm-blueprints/rust/Dockerfile new file mode 100644 index 0000000..8307a67 --- /dev/null +++ b/crates/wasm-blueprints/rust/Dockerfile @@ -0,0 +1,6 @@ +FROM rust:alpine + +RUN apk update --no-cache && apk add make +WORKDIR /tmp/build +COPY . . +RUN make module.wasm \ No newline at end of file diff --git a/crates/wasm-blueprints/rust/Makefile b/crates/wasm-blueprints/rust/Makefile new file mode 100644 index 0000000..734f517 --- /dev/null +++ b/crates/wasm-blueprints/rust/Makefile @@ -0,0 +1,35 @@ +.DEFAULT_GOAL = help +TARGET = module.wasm +.PHONY: help $(TARGET) +DOCKER_IMAGE = yozefu-wasm-blueprints-rust + +build: ## Try to build the wasm module. 
If it fails, it tries to build with Docker + make -S $(TARGET) || make -S build-from-docker + +build-from-docker: ## Build the wasm module with Docker + @echo " 🐋 Trying to build wasm module with docker" + docker build -t $(DOCKER_IMAGE) . + docker create --name $(DOCKER_IMAGE) $(DOCKER_IMAGE) + docker cp $(DOCKER_IMAGE):/tmp/build/$(TARGET) . + docker rm -v $(DOCKER_IMAGE) + +$(TARGET): ## Build the wasm module + rustup target add wasm32-unknown-unknown + cargo build --release --target wasm32-unknown-unknown + cp target/wasm32-unknown-unknown/release/yozefu_wasm_blueprints_rust.wasm $@ + +test: $(TARGET) ## Run the tests + cat "tests/parameters.json" | extism call --stdin $(TARGET) parse_parameters + cat "tests/match.json" | extism call --stdin $(TARGET) matches | grep -q '"match":true' + cat "tests/no-match.json" | extism call --stdin $(TARGET) matches | grep -q '"match":false' + +clean: ## Clean the wasm file and the target directory + cargo clean || true + rm -f $(TARGET) + +help: ## Show this help + @echo "Variables:" + @make -pnf $(MAKEFILE_LIST) | awk '/^# (makefile |command)/{getline; print}' | grep -v "^MAKEFILE_LIST" | sort | uniq | awk 'BEGIN {FS = ":?= "}; {printf " \033[36m%-30s\033[0m %s\n", $$1, $$2}' + @echo "\nTargets:" + @grep -E '^[/%a-zA-Z0-9_-]+: .*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ": .*?## "}; {printf " \033[36m%-30s\033[0m %s\n", $$1, $$2}' + diff --git a/crates/wasm-blueprints/rust/README.md b/crates/wasm-blueprints/rust/README.md new file mode 100644 index 0000000..3df7405 --- /dev/null +++ b/crates/wasm-blueprints/rust/README.md @@ -0,0 +1,11 @@ +# Rust search filter + +Link explaining how to write a search filter +Link to Extism Rust PDK + +Blueprint project to write a search filter in Rust. +To build the WebAssembly module: +```bash +make build +yozf import-filter module.wasm --name 'key-ends-with' +``` \ No newline at end of file diff --git a/crates/wasm-blueprints/rust/src/lib.rs b/crates/wasm-blueprints/rust/src/lib.rs new file mode 100644 index 0000000..e35075b --- /dev/null +++ b/crates/wasm-blueprints/rust/src/lib.rs @@ -0,0 +1,48 @@ +#![no_main] +/// This filter returns `true` +/// when the key of the kafka record ends with the user-specified string parameter +/// +/// ```sql +/// key-ends-with("rust") +/// ``` +/// +/// This WebAssembly module relies on the [Extism SDK](https://extism.org/docs/quickstart/plugin-quickstart) +use extism_pdk::*; +use json::Value; + +use yozefu_wasm_types::{FilterInput, FilterResult}; + +#[plugin_fn] +pub fn matches(input: Json) -> FnResult> { + // TODO - Edit the code as per your requirements + let first_param = input.0.params.first().unwrap().as_str().unwrap(); + let key = input.0.record.key.raw(); + + Ok(Json(key.ends_with(first_param).into())) +} + +#[plugin_fn] +/// This function checks if the input parameters are valid +pub fn parse_parameters(params: Json>) -> FnResult<()> { + // TODO - Edit the code as per your requirements + let length = params.0.len(); + if length != 1 { + return Err(WithReturnCode::new( + Error::msg(format!( + "This search filter expects a string argument. 
Found {} arguments", + &length.to_string() + )), + 1, + )); + } + if params.0.first().unwrap().is_string() { + return Ok(()); + } + return Err(WithReturnCode::new( + Error::msg(format!( + "This search filter expects argument 1 to be a string, found {}", + json::to_string(params.0.first().unwrap()).unwrap() + )), + 2, + )); +} diff --git a/crates/wasm-blueprints/rust/tests/match.json b/crates/wasm-blueprints/rust/tests/match.json new file mode 100644 index 0000000..4a57d7c --- /dev/null +++ b/crates/wasm-blueprints/rust/tests/match.json @@ -0,0 +1,14 @@ +{ + "record": { + "value": "", + "key": "21965", + "topic": "public-french-addresses", + "timestamp": 1732479526752, + "partition": 0, + "offset": 4, + "headers": {} + }, + "params": [ + "965" + ] +} \ No newline at end of file diff --git a/crates/wasm-blueprints/rust/tests/no-match.json b/crates/wasm-blueprints/rust/tests/no-match.json new file mode 100644 index 0000000..1b6b8bf --- /dev/null +++ b/crates/wasm-blueprints/rust/tests/no-match.json @@ -0,0 +1,14 @@ +{ + "record": { + "value": "", + "key": "21965", + "topic": "public-french-addresses", + "timestamp": 1732479526752, + "partition": 0, + "offset": 4, + "headers": {} + }, + "params": [ + "easy-cookie" + ] +} \ No newline at end of file diff --git a/crates/wasm-blueprints/rust/tests/parameters.json b/crates/wasm-blueprints/rust/tests/parameters.json new file mode 100644 index 0000000..c501502 --- /dev/null +++ b/crates/wasm-blueprints/rust/tests/parameters.json @@ -0,0 +1 @@ +["a-key-suffix"] \ No newline at end of file diff --git a/crates/wasm-types/Cargo.toml b/crates/wasm-types/Cargo.toml new file mode 100644 index 0000000..7cd21d7 --- /dev/null +++ b/crates/wasm-types/Cargo.toml @@ -0,0 +1,18 @@ +[package] +name = "yozefu-wasm-types" +readme = "README.md" +description = "Types and structures for defining WebAssembly modules" +keywords = ["wasm", "kafka", "record"] +categories = [] +version.workspace = true +authors.workspace = true +edition.workspace = true +homepage.workspace = true +license.workspace = true +repository.workspace = true + + +[dependencies] +serde = { version = "1.0.215", features = ["derive"] } +serde_json = { version = "1.0.133", features = ["preserve_order"] } +lib = { workspace = true } diff --git a/crates/wasm-types/LICENSE b/crates/wasm-types/LICENSE new file mode 100644 index 0000000..9d188fa --- /dev/null +++ b/crates/wasm-types/LICENSE @@ -0,0 +1,13 @@ +Copyright [2024] yozefu-wasm-types + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. +You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. 
\ No newline at end of file diff --git a/crates/wasm-types/README.md b/crates/wasm-types/README.md new file mode 100644 index 0000000..cdad449 --- /dev/null +++ b/crates/wasm-types/README.md @@ -0,0 +1,13 @@ +# yozefu-wasm-types + +[![Build](https://github.com/MAIF/yozefu/actions/workflows/build.yml/badge.svg)](https://github.com/MAIF/yozefu/actions/workflows/build.yml) +[![](https://img.shields.io/crates/v/yozefu-wasm-types.svg)](https://crates.io/crates/yozefu-wasm-types) + +This library provides structures for defining a WebAssembly module for the search engine of [yozefu](https://github.com/MAIF/yozefu). It uses `json` to exchange data from the host to the WebAssembly module. + +## Usage + +**NOTE:** You probably don't want to use this crate directly. Instead you should run the command `create-filter`: +```bash +yozf create-filter --language rust key-ends-with +``` \ No newline at end of file diff --git a/crates/wasm-types/examples/input.json b/crates/wasm-types/examples/input.json new file mode 100644 index 0000000..406c6ac --- /dev/null +++ b/crates/wasm-types/examples/input.json @@ -0,0 +1,40 @@ +{ + "record": { + "value": { + "type": "Feature", + "geometry": { + "type": "Point", + "coordinates": [ + 4.0215, + 49.211613 + ] + }, + "properties": { + "label": "Rue Franz Kafka 51100 Reims", + "score": 0.6992354545454544, + "id": "51454_3782", + "name": "Rue Franz Kafka", + "postcode": "51100", + "citycode": "51454", + "x": 774438.44, + "y": 6901804.03, + "city": "Reims", + "context": "51, Marne, Grand Est", + "type": "street", + "importance": 0.69159, + "street": "Rue Franz Kafka" + } + }, + "key": 21965, + "topic": "public-french-addresses", + "timestamp": 1732479526752, + "partition": 0, + "offset": 4, + "headers": { + "my-header": "hello" + } + }, + "params": [ + "1234" + ] +} \ No newline at end of file diff --git a/crates/wasm-types/src/lib.rs b/crates/wasm-types/src/lib.rs new file mode 100644 index 0000000..3ed60ff --- /dev/null +++ b/crates/wasm-types/src/lib.rs @@ -0,0 +1,37 @@ +//! Crate containing the types used to define search filters. 
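+//!
+//! The host serializes a [`FilterInput`] (the kafka record plus the filter parameters) to JSON
+//! before calling the WebAssembly module, and reads back a [`FilterResult`] indicating whether
+//! the record matched.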
+pub use lib::FilterResult; +use lib::KafkaRecord; +use serde::{Deserialize, Serialize}; +use serde_json::Value; + +/// A filter receives a a struct composed of 2 fields: +/// - The consumed kafka record, +/// - The parameters passed to that filter, it can be numbers or strings +/// +/// +/// If the filter is `key-ends-with('1234')`, [FilterInput] serialized as json will be: +/// +/// ```json +/// { +/// "record": { +/// "value": "Hello world", +/// "key": "f240ff26-66b3-40d0-a99e-861300c24753", +/// "topic": "my-topic", +/// "timestamp": 1717842091489, +/// "partition": 0, +/// "offset": 23, +/// "headers": { +/// "my-header": "my-value" +/// } +/// }, +/// "params": [ +/// "1234" +/// ] +/// } +/// ``` +#[derive(Clone, Debug, Deserialize, Serialize)] +#[serde(rename_all = "lowercase")] +pub struct FilterInput { + pub record: KafkaRecord, + pub params: Vec, +} diff --git a/deny.toml b/deny.toml new file mode 100644 index 0000000..dc2bf17 --- /dev/null +++ b/deny.toml @@ -0,0 +1,247 @@ +# This template contains all of the possible sections and their default values + +# Note that all fields that take a lint level have these possible values: +# * deny - An error will be produced and the check will fail +# * warn - A warning will be produced, but the check will not fail +# * allow - No warning or error will be produced, though in some cases a note +# will be + +# The values provided in this template are the default values that will be used +# when any section or field is not specified in your own configuration + +# Root options + +# The graph table configures how the dependency graph is constructed and thus +# which crates the checks are performed against +[graph] +# If 1 or more target triples (and optionally, target_features) are specified, +# only the specified targets will be checked when running `cargo deny check`. +# This means, if a particular package is only ever used as a target specific +# dependency, such as, for example, the `nix` crate only being used via the +# `target_family = "unix"` configuration, that only having windows targets in +# this list would mean the nix crate, as well as any of its exclusive +# dependencies not shared by any other crates, would be ignored, as the target +# list here is effectively saying which targets you are building for. +targets = [ + "x86_64-unknown-linux-gnu", + "aarch64-unknown-linux-gnu", + "aarch64-apple-darwin", + "x86_64-apple-darwin", + "x86_64-pc-windows-gnu", + "x86_64-pc-windows-msvc" +] +# When creating the dependency graph used as the source of truth when checks are +# executed, this field can be used to prune crates from the graph, removing them +# from the view of cargo-deny. This is an extremely heavy hammer, as if a crate +# is pruned from the graph, all of its dependencies will also be pruned unless +# they are connected to another crate in the graph that hasn't been pruned, +# so it should be used with care. The identifiers are [Package ID Specifications] +# (https://doc.rust-lang.org/cargo/reference/pkgid-spec.html) +#exclude = [] +# If true, metadata will be collected with `--all-features`. Note that this can't +# be toggled off if true, if you want to conditionally enable `--all-features` it +# is recommended to pass `--all-features` on the cmd line instead +all-features = false +# If true, metadata will be collected with `--no-default-features`. The same +# caveat with `all-features` applies +no-default-features = false +# If set, these feature will be enabled when collecting metadata. 
If `--features` +# is specified on the cmd line they will take precedence over this option. +#features = [] + +# The output table provides options for how/if diagnostics are outputted +[output] +# When outputting inclusion graphs in diagnostics that include features, this +# option can be used to specify the depth at which feature edges will be added. +# This option is included since the graphs can be quite large and the addition +# of features from the crate(s) to all of the graph roots can be far too verbose. +# This option can be overridden via `--feature-depth` on the cmd line +feature-depth = 1 + +# This section is considered when running `cargo deny check advisories` +# More documentation for the advisories section can be found here: +# https://embarkstudios.github.io/cargo-deny/checks/advisories/cfg.html +[advisories] +# The path where the advisory databases are cloned/fetched into +#db-path = "$CARGO_HOME/advisory-dbs" +# The url(s) of the advisory databases to use +#db-urls = ["https://github.com/rustsec/advisory-db"] +# A list of advisory IDs to ignore. Note that ignored advisories will still +# output a note when they are encountered. +ignore = [ + + { id = "RUSTSEC-2024-0320", reason = "erff " }, + #"RUSTSEC-0000-0000", + #{ id = "RUSTSEC-0000-0000", reason = "you can specify a reason the advisory is ignored" }, + #"a-crate-that-is-yanked@0.1.1", # you can also ignore yanked crate versions if you wish + #{ crate = "a-crate-that-is-yanked@0.1.1", reason = "you can specify why you are ignoring the yanked crate" }, +] +# If this is true, then cargo deny will use the git executable to fetch advisory database. +# If this is false, then it uses a built-in git library. +# Setting this to true can be helpful if you have special authentication requirements that cargo-deny does not support. +# See Git Authentication for more information about setting up git authentication. +#git-fetch-with-cli = true + +# This section is considered when running `cargo deny check licenses` +# More documentation for the licenses section can be found here: +# https://embarkstudios.github.io/cargo-deny/checks/licenses/cfg.html +[licenses] +# List of explicitly allowed licenses +# See https://spdx.org/licenses/ for list of possible licenses +# [possible values: any SPDX 3.11 short identifier (+ optional exception)]. +allow = [ + "Apache-2.0", + "Apache-2.0 WITH LLVM-exception", + "MIT", + "MPL-2.0", + "BSD-3-Clause", + "ISC", + "Unicode-3.0", + "OpenSSL" +] +# The confidence threshold for detecting a license from license text. +# The higher the value, the more closely the license text must be to the +# canonical license text of a valid SPDX license file. +# [possible values: any between 0.0 and 1.0]. 
+confidence-threshold = 0.8 +# Allow 1 or more licenses on a per-crate basis, so that particular licenses +# aren't accepted for every possible crate as with the normal allow list +exceptions = [ + # Each entry is the crate and version constraint, and its specific allow + # list + { allow = ["Zlib"], crate = "foldhash" }, + { allow = ["BSL-1.0"], crate = "lazy-bytes-cast" }, + { allow = ["Zlib"], crate = "throbber-widgets-tui" }, +] + +# Some crates don't have (easily) machine readable licensing information, +# adding a clarification entry for it allows you to manually specify the +# licensing information +[[licenses.clarify]] +name = "ring" +expression = "ISC AND MIT AND OpenSSL" +license-files = [ + { path = "LICENSE", hash = 0xbd0eed23 } +] +# The package spec the clarification applies to + +# The SPDX expression for the license requirements of the crate +# One or more files in the crate's source used as the "source of truth" for +# the license expression. If the contents match, the clarification will be used +# when running the license check, otherwise the clarification will be ignored +# and the crate will be checked normally, which may produce warnings or errors +# depending on the rest of your configuration +#license-files = [ +# Each entry is a crate relative path, and the (opaque) hash of its contents +#{ path = "LICENSE", hash = 0xbd0eed23 } +#] + +[licenses.private] +# If true, ignores workspace crates that aren't published, or are only +# published to private registries. +# To see how to mark a crate as unpublished (to the official registry), +# visit https://doc.rust-lang.org/cargo/reference/manifest.html#the-publish-field. +ignore = false +# One or more private registries that you might publish crates to, if a crate +# is only published to private registries, and ignore is true, the crate will +# not have its license(s) checked +registries = [ + #"https://sekretz.com/registry +] + +# This section is considered when running `cargo deny check bans`. +# More documentation about the 'bans' section can be found here: +# https://embarkstudios.github.io/cargo-deny/checks/bans/cfg.html +[bans] +# Lint level for when multiple versions of the same crate are detected +multiple-versions = "warn" +# Lint level for when a crate version requirement is `*` +wildcards = "allow" +# The graph highlighting used when creating dotgraphs for crates +# with multiple versions +# * lowest-version - The path to the lowest versioned duplicate is highlighted +# * simplest-path - The path to the version with the fewest edges is highlighted +# * all - Both lowest-version and simplest-path are used +highlight = "all" +# The default lint level for `default` features for crates that are members of +# the workspace that is being checked. This can be overridden by allowing/denying +# `default` on a crate-by-crate basis if desired. +workspace-default-features = "allow" +# The default lint level for `default` features for external crates that are not +# members of the workspace. This can be overridden by allowing/denying `default` +# on a crate-by-crate basis if desired. +external-default-features = "allow" +# List of crates that are allowed. Use with care! 
+allow = [ + #"ansi_term@0.11.0", + #{ crate = "ansi_term@0.11.0", reason = "you can specify a reason it is allowed" }, +] +# List of crates to deny +deny = [ + #"ansi_term@0.11.0", + #{ crate = "ansi_term@0.11.0", reason = "you can specify a reason it is banned" }, + # Wrapper crates can optionally be specified to allow the crate when it + # is a direct dependency of the otherwise banned crate + #{ crate = "ansi_term@0.11.0", wrappers = ["this-crate-directly-depends-on-ansi_term"] }, +] + +# List of features to allow/deny +# Each entry the name of a crate and a version range. If version is +# not specified, all versions will be matched. +#[[bans.features]] +#crate = "reqwest" +# Features to not allow +#deny = ["json"] +# Features to allow +#allow = [ +# "rustls", +# "__rustls", +# "__tls", +# "hyper-rustls", +# "rustls", +# "rustls-pemfile", +# "rustls-tls-webpki-roots", +# "tokio-rustls", +# "webpki-roots", +#] +# If true, the allowed features must exactly match the enabled feature set. If +# this is set there is no point setting `deny` +#exact = true + +# Certain crates/versions that will be skipped when doing duplicate detection. +skip = [ + #"ansi_term@0.11.0", + #{ crate = "ansi_term@0.11.0", reason = "you can specify a reason why it can't be updated/removed" }, +] +# Similarly to `skip` allows you to skip certain crates during duplicate +# detection. Unlike skip, it also includes the entire tree of transitive +# dependencies starting at the specified crate, up to a certain depth, which is +# by default infinite. +skip-tree = [ + #"ansi_term@0.11.0", # will be skipped along with _all_ of its direct and transitive dependencies + #{ crate = "ansi_term@0.11.0", depth = 20 }, +] + +# This section is considered when running `cargo deny check sources`. +# More documentation about the 'sources' section can be found here: +# https://embarkstudios.github.io/cargo-deny/checks/sources/cfg.html +[sources] +# Lint level for what to happen when a crate from a crate registry that is not +# in the allow list is encountered +unknown-registry = "warn" +# Lint level for what to happen when a crate from a git repository that is not +# in the allow list is encountered +unknown-git = "warn" +# List of URLs for allowed crate registries. Defaults to the crates.io index +# if not specified. If it is specified but empty, no registries are allowed. +allow-registry = ["https://github.com/rust-lang/crates.io-index"] +# List of URLs for allowed Git repositories +allow-git = [] + +[sources.allow-org] +# github.com organizations to allow git sources for +github = [] +# gitlab.com organizations to allow git sources for +gitlab = [] +# bitbucket.org organizations to allow git sources for +bitbucket = [] diff --git a/docs/query-language/README.md b/docs/query-language/README.md new file mode 100644 index 0000000..66bd002 --- /dev/null +++ b/docs/query-language/README.md @@ -0,0 +1,42 @@ +# Yōzefu query language. + + +Yōzefu uses a home-made query language inspired of the SQL syntax to search for kafka records. The BNF-like grammar is [available here](https://github.com/MAIF/yozefu/blob/main/crates/lib/src/search/mod.rs). here are some examples of queries: + +1. The 20 first records on partition 2 where the key contains `foo`: + +```sql +from begin +partition == 2 +and key contains "foo" +limit 20 +``` + + +2. 
Records where the offset is greater than or equal to 3_460, only on partitions 1 and 4, and where the JSON value has a property `album.title` equal to `Virtue`:
+
+```sql
+from begin where
+(partition == 1 || partition == 4)
+and offset >= 3_460
+and value.album.title == "Virtue"
+```
+
+3. Among the last 500 records, list those whose size is larger than 5 MB:
+```sql
+from end - 500
+size > 5_000_000
+```
+
+4. Is there a record containing `release` within a specific time range?
+```sql
+from "2024-11-23T12:00:00.000+01:00"
+where timestamp between "2024-11-23T12:00:00.000+01:00" and "2024-11-23T15:00:00.000+01:00"
+value contains "release"
+```
+
+
+5. Records where `md5(key)` is equal to the user-provided parameter. A [search filter](./search-filter/README.md) must be implemented for this example.
+```sql
+from begin md5-key-equals-to("d131dd02c5e6eec4693d9a0698aff95c2fcab58712467eab4004583eb8fb7f89")
+```
\ No newline at end of file
diff --git a/docs/release/README.md b/docs/release/README.md
new file mode 100644
index 0000000..d033506
--- /dev/null
+++ b/docs/release/README.md
@@ -0,0 +1,12 @@
+# Releasing a new version
+
+This document explains the release process of the tool. Most of the steps are automated with GitHub Actions. The `main` branch of the repository is protected. If you want to release a new version of the tool, you must create a release branch.
+
+
+1. Ensure you are on a release branch.
+2. Install `cargo-release`: `cargo install cargo-release`
+3. Thanks to conventional commits and `cargo-semver-checks`, you can determine the next version to release.
+4. Bump the version: `cargo release "<version>" --no-publish --no-confirm --execute --no-tag`
+5. [Create the pull request](https://github.com/MAIF/yozefu/compare).
+6. If all checks succeed, the pull request will be accepted by a reviewer.
+7. When a new release is created, a GitHub Actions workflow is triggered to update the changelog. The updated changelog will be available on a branch named `changelog/<version>`. A pull request is automatically created with that branch, which must be approved to be merged into `main`.
\ No newline at end of file
diff --git a/docs/schema-registry/README.md b/docs/schema-registry/README.md
new file mode 100644
index 0000000..23ca65c
--- /dev/null
+++ b/docs/schema-registry/README.md
@@ -0,0 +1,78 @@
+# Schema registry
+
+⚠️ Schema registry support in Yozefu is **highly experimental**. Contributions and feedback are welcome.
+
+| Types       | Support               |
+| ----------- | :-------------------- |
+| Json schema | Experimental          |
+| Avro        | Experimental          |
+| Protobuf    | Not supported for now |
+
+
+
+You can configure the tool to use a schema registry.
Open the configuration file with `yozf configure` and add a `schema_registry` entry to your cluster:
+```json
+{
+  "clusters": {
+    "localhost": {
+      "url_template": "http://localhost:9000/ui/kafka-localhost-server/topic/{topic}/data?single=true&partition={partition}&offset={offset}",
+      "schema_registry": {
+        "url": "http://localhost:8081"
+      },
+      "kafka": {
+        "bootstrap.servers": "localhost:9092",
+        "security.protocol": "plaintext",
+        "broker.address.family": "v4"
+      }
+    }
+  }
+}
+```
+
+
+
+
+
+## Basic auth
+
+If the schema registry is protected by basic authentication, you can add the `Authorization` header:
+
+```json
+{
+  "schema_registry": {
+    "url": "https://acme-schema-registry:8081",
+    "headers": {
+      "Authorization": "Basic am9obkBleGFtcGxlLmNvbTphYmMxMjM="
+    }
+  }
+}
+```
+
+
+## Bearer token
+
+```json
+{
+  "schema_registry": {
+    "url": "https://acme-schema-registry:8081",
+    "headers": {
+      "Authorization": "Bearer <token>"
+    }
+  }
+}
+
+```
+
+
+
+
+## Authentication methods per provider
+
+
+| Name                                                                                 | Authentication methods |
+| ------------------------------------------------------------------------------------ | ---------------------- |
+| [Confluent](https://confluent.cloud)                                                  | Basic authentication
OAuth | +| [Redpanda Cloud](https://docs.redpanda.com/current/manage/security/authentication/) | Basic authentication
OAuth | + + + diff --git a/docs/schemas/MyProducer.java b/docs/schemas/MyProducer.java new file mode 100644 index 0000000..d5effb1 --- /dev/null +++ b/docs/schemas/MyProducer.java @@ -0,0 +1,293 @@ +//JAVA 21+ +//REPOS central,confluent=https://packages.confluent.io/maven +//DEPS com.fasterxml.jackson.core:jackson-databind:2.18.1 +//DEPS org.apache.kafka:kafka-clients:3.9.0 +//DEPS io.confluent:kafka-protobuf-serializer:7.7.1 +//DEPS io.confluent:kafka-avro-serializer:7.7.1 +//DEPS io.confluent:kafka-json-schema-serializer:7.7.1 +//DEPS io.confluent:kafka-protobuf-serializer:7.7.1 +//DEPS org.slf4j:slf4j-nop:2.0.16 +//DEPS tech.allegro.schema.json2avro:converter:0.2.15 +//DEPS com.google.protobuf:protobuf-java:3.25.4 +//DEPS info.picocli:picocli:4.7.6 + +//FILES avro/key-schema.json=avro/key-schema.json +//FILES avro/value-schema.json=avro/value-schema.json +//FILES json-schema/value-schema.json=json-schema/value-schema.json +//FILES json-schema/key-schema.json=json-schema/key-schema.json +//FILES protobuf/key-schema.proto=protobuf/key-schema.proto +//FILES protobuf/value-schema.proto=protobuf/value-schema.proto + + +import com.fasterxml.jackson.core.JsonProcessingException; +import com.fasterxml.jackson.databind.JsonNode; +import com.fasterxml.jackson.databind.node.TextNode; +import com.fasterxml.jackson.databind.ObjectMapper; +import io.confluent.kafka.schemaregistry.json.JsonSchemaUtils; +import io.confluent.kafka.serializers.KafkaAvroSerializer; +import io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer; +import org.apache.kafka.common.serialization.ByteArraySerializer; +import org.apache.kafka.common.serialization.StringSerializer; +import io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer; +import org.apache.avro.generic.GenericData; +import org.apache.avro.Schema; +import org.apache.avro.generic.GenericRecord; +import org.apache.kafka.clients.producer.*; +import com.google.protobuf.DynamicMessage; + +import java.util.*; + +import io.confluent.kafka.schemaregistry.protobuf.ProtobufSchemaUtils; + +import java.io.ByteArrayOutputStream; +import java.io.IOException; +import java.net.URI; +import java.net.http.HttpClient; +import java.net.http.HttpRequest; +import java.net.http.HttpResponse; +import java.nio.charset.StandardCharsets; +import java.util.concurrent.Callable; +import java.util.stream.Collectors; + +import io.confluent.kafka.schemaregistry.protobuf.ProtobufSchema; +import picocli.CommandLine; +import picocli.CommandLine.Command; +import tech.allegro.schema.json2avro.converter.JsonAvroConverter; + + +enum SerializerType { + avro, json, jsonSchema, protobuf, text, malformed +} + +@Command(name = "MyProducer.java", version = "1.0.0", mixinStandardHelpOptions = true, + description = "Tool to produce kafka records with different serializers." 
+) +class MyProducer implements Callable { + + @CommandLine.Option(names = {"--topic"}, description = "The topic to produce records to.") + private String topic = "public-french-addresses"; + + @CommandLine.Option(names = {"--type"}, description = "avro, json, jsonSchema, protobuf, text or malformed", defaultValue = "json") + private SerializerType type = SerializerType.json; + + @CommandLine.Parameters(description = "Your query passed to 'https://api-adresse.data.gouv.fr/search/?q='", defaultValue = "kafka") + private String query; + + @Override + public Integer call() throws Exception { + Properties props = new Properties(); + props.put("bootstrap.servers", "localhost:9092"); + var schemaRegistryUrl = System.getenv().getOrDefault("YOZEFU_SCHEMA_REGISTRY_URL", "http://localhost:8081"); + props.put("schema.registry.url", System.getenv().getOrDefault("YOZEFU_SCHEMA_REGISTRY_URL", schemaRegistryUrl)); + System.err.printf(" 📖 schema registry URL is %s\n", schemaRegistryUrl); + + var url = System.getenv().getOrDefault("YOZEFU_API_URL", "https://api-adresse.data.gouv.fr/search/?q=%s"); + System.err.printf(" 🔩 The API is '%s'\n", url); + var data = get(url, query); + + System.err.printf(" 📣 About to producing records to topic '%s', serialization type is '%s'\n", topic, type); + switch (type) { + case avro -> { + props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class.getName()); + props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class.getName()); + KafkaProducer producer = new KafkaProducer<>(props); + produce(producer, new IntoAvro(), data, topic); + } + case json -> { + props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName()); + props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName()); + KafkaProducer producer = new KafkaProducer<>(props); + produce(producer, new IntoJson(), data, topic); + } + case jsonSchema -> { + props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, KafkaJsonSchemaSerializer.class.getName()); + props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaJsonSchemaSerializer.class.getName()); + KafkaProducer producer = new KafkaProducer<>(props); + produce(producer, new IntoJsonSchema(), data, topic); + } + case protobuf -> { + System.err.printf(" ⚠️ Protobuf serialization is experimental and may not work as expected\n"); + props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, KafkaProtobufSerializer.class.getName()); + props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaProtobufSerializer.class.getName()); + KafkaProducer producer = new KafkaProducer<>(props); + produce(producer, new IntoProtobuf(), data, topic); + } + case text -> { + props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName()); + props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName()); + KafkaProducer producer = new KafkaProducer<>(props); + produce(producer, new IntoText(), data, topic); + } + case malformed -> { + props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class); + props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class); + KafkaProducer producer = new KafkaProducer<>(props); + produce(producer, new IntoMalformed(), data, topic); + } + default -> { + System.err.printf(" ❕ Format '%s' is unknown. 
Known formats are ['avro', 'json', 'json-schema', 'text', 'malformed']\n", type); + props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName()); + props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName()); + KafkaProducer producer = new KafkaProducer<>(props); + produce(producer, new IntoText(), data, topic); + } + } + return 0; + } + + public static void produce(KafkaProducer producer, Into mapper, List addresses, String topic) throws Exception { + for (var address : addresses) { + var record = mapper.into(address, topic); + producer.send(record, onSend()); + } + producer.flush(); + producer.close(); + } + + private static Callback onSend() { + return (RecordMetadata metadata, Exception exception) -> { + if (exception != null) { + exception.printStackTrace(); + } else { + System.err.println(" A new record has been produced to partition " + metadata.partition() + " with offset " + metadata.offset()); + } + }; + } + + private static List get(String apiUrl, String query) throws IOException, InterruptedException { + System.err.printf(" 🏡 Searching french addresses matching the query '%s'\n", query); + var url = String.format(apiUrl, query.trim().toLowerCase()); + + try(var client = HttpClient.newHttpClient()) { + var request = HttpRequest.newBuilder() + .header("Accept", "application/json") + .uri(URI.create(url)) + .build(); + var response = client.send(request, HttpResponse.BodyHandlers.ofString()); + var body = response.body(); + ObjectMapper mapper = new ObjectMapper(); + // System.err.println(body); + JsonNode node = mapper.readTree(body); + List addresses = new ArrayList<>(); + if(node.isArray()) { + for (JsonNode n : node) { + addresses.add(n); + } + } + if(node.isObject()) { + for (JsonNode n : node.get("features")) { + addresses.add(n); + } + } + return addresses.stream().map(JsonNode::toString).collect(Collectors.toList()); + } + } + + public static void main(String[] args) { + int exitCode = new CommandLine(new MyProducer()) + .setCaseInsensitiveEnumValuesAllowed(true) + .execute(args); + System.exit(exitCode); + } + +} + + +interface Into { + ProducerRecord into(String value, String topic) throws Exception; + + default String generateKey() { + return UUID.randomUUID().toString(); + } + + default String readResource(String path) throws Exception { + try(var in = Into.class.getResourceAsStream(path)) { + return new String(in.readAllBytes(), StandardCharsets.UTF_8); + } + } +} + +class IntoText implements Into { + public ProducerRecord into(String value, String topic) throws JsonProcessingException { + var objectMapper = new ObjectMapper(); + var object = objectMapper.readTree(value); + return new ProducerRecord<>(topic, this.generateKey(), object.get("properties").get("label").asText()); + } +} + +class IntoJson implements Into { + public ProducerRecord into(String value, String topic) { + return new ProducerRecord<>(topic, generateKey(), value); + } +} + +class IntoJsonSchema implements Into { + public ProducerRecord into(String input, String topic) throws Exception { + var objectMapper = new ObjectMapper(); + var keySchemaString = readResource("/json-schema/key-schema.json"); + var valueSchemaString = readResource("/json-schema/value-schema.json"); + var keySchema = objectMapper.readTree(keySchemaString); + var valueSchema = objectMapper.readTree(valueSchemaString); + + var key = TextNode.valueOf(generateKey()); + var keyEnvelope = JsonSchemaUtils.envelope(keySchema, key); + + var value = objectMapper.readTree(input); + 
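+        // JsonSchemaUtils.envelope() pairs each JSON payload with the schema loaded above,
+        // so KafkaJsonSchemaSerializer serializes (and registers) it against that exact
+        // schema instead of deriving one from the Java object.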
var valueEnvelope = JsonSchemaUtils.envelope(valueSchema, value); + + return new ProducerRecord<>(topic, keyEnvelope, valueEnvelope); + } +} + +class IntoAvro implements Into { + public ProducerRecord into(String input, String topic) throws Exception { + var keySchemaString = readResource("/avro/key-schema.json"); + var valueSchemaString = readResource("/avro/value-schema.json"); + + Schema.Parser schemaParser = new Schema.Parser(); + Schema keySchema = schemaParser.parse(keySchemaString); + Schema valueSchema = schemaParser.parse(valueSchemaString); + JsonAvroConverter converter = new JsonAvroConverter(); + + var keyString = String.format("{ \"id\": \"%s\", \"sunny\": %s }", generateKey(), new Random().nextBoolean()); + GenericData.Record key = converter.convertToGenericDataRecord(keyString.getBytes(), keySchema); + GenericData.Record value = converter.convertToGenericDataRecord(input.getBytes(), valueSchema); + return new ProducerRecord<>(topic, key, value); + } +} + +// TODO work in progress +class IntoProtobuf implements Into { + public ProducerRecord into(String input, String topic) throws Exception { + var keySchemaString = readResource("/protobuf/key-schema.proto"); + var valueSchemaString = readResource("/protobuf/value-schema.proto"); + + ProtobufSchema keySchema = new ProtobufSchema(keySchemaString); + var keyString = String.format("{\"id\": \"%s\"}", this.generateKey()); + var key = (DynamicMessage) ProtobufSchemaUtils.toObject(keyString, keySchema); + + ProtobufSchema valueSchema = new ProtobufSchema(valueSchemaString); + var value = (DynamicMessage) ProtobufSchemaUtils.toObject(input, valueSchema); + + return new ProducerRecord<>(topic, key, value); + } +} + +class IntoMalformed implements Into { + public ProducerRecord into(String input, String topic) throws Exception { + byte randomSchemaId = (byte) ((Math.random() * (127 - 1)) + 1); + var header = new byte[]{0, 0, 0, 0, randomSchemaId}; + + ByteArrayOutputStream keyOutput = new ByteArrayOutputStream(); + keyOutput.write(header); + keyOutput.write((generateKey() + " key").getBytes()); + + ByteArrayOutputStream valueOutput = new ByteArrayOutputStream(); + valueOutput.write(header); + var objectMapper = new ObjectMapper(); + var object = objectMapper.readTree(input); + valueOutput.write(object.get("properties").get("context").asText().getBytes(StandardCharsets.UTF_8)); + + return new ProducerRecord<>(topic, keyOutput.toByteArray(), valueOutput.toByteArray()); + } +} \ No newline at end of file diff --git a/docs/schemas/avro/key-schema.json b/docs/schemas/avro/key-schema.json new file mode 100644 index 0000000..8be8ce7 --- /dev/null +++ b/docs/schemas/avro/key-schema.json @@ -0,0 +1,15 @@ +{ + "name": "Key", + "namespace": "io.maif.yozefu", + "type": "record", + "fields": [ + { + "name": "id", + "type": "string" + }, + { + "name": "sunny", + "type": "boolean" + } + ] +} \ No newline at end of file diff --git a/docs/schemas/avro/value-schema.json b/docs/schemas/avro/value-schema.json new file mode 100644 index 0000000..3d56c4b --- /dev/null +++ b/docs/schemas/avro/value-schema.json @@ -0,0 +1,247 @@ +{ + "name": "Feature", + "namespace": "io.maif.yozefu", + "doc": "A GeoJSON Feature object.", + "type": "record", + "fields": [ + { + "name": "type", + "type": { + "type": "enum", + "name": "FeatureType", + "symbols": [ + "Feature" + ] + }, + "default": "Feature" + }, + { + "name": "geometry", + "type": [ + { + "name": "Point", + "doc": "Describes a point geometry", + "type": "record", + "fields": [ + { + "name": "type", + "type": { 
+ "type": "enum", + "name": "PointType", + "symbols": [ + "Point" + ] + }, + "default": "Point" + }, + { + "name": "coordinates", + "type": { + "type": "array", + "items": "double" + } + } + ] + }, + { + "name": "MultiPoint", + "namespace": "io.maif.yozefu.sim.support.geojson.geometry", + "doc": "Describes a collection of points geometry", + "type": "record", + "fields": [ + { + "name": "type", + "type": { + "name": "MultiPointType", + "namespace": "io.maif.yozefu.sim.support.geojson.geometry", + "type": "enum", + "symbols": [ + "MultiPoint" + ] + }, + "default": "MultiPoint" + }, + { + "name": "coordinates", + "type": { + "type": "array", + "items": { + "type": "array", + "items": "double" + } + } + } + ] + }, + { + "name": "LineString", + "doc": "Describes a LineString geometry", + "type": "record", + "fields": [ + { + "name": "type", + "type": { + "type": "enum", + "name": "LineStringType", + "symbols": [ + "LineString" + ] + }, + "default": "LineString" + }, + { + "name": "coordinates", + "type": { + "type": "array", + "name": "CoordinatesType", + "items": { + "type": "array", + "items": "double" + } + } + } + ] + }, + { + "name": "MultiLineString", + "doc": "Describes a MultiLineString geometry", + "type": "record", + "fields": [ + { + "name": "type", + "type": { + "type": "enum", + "name": "MultiLineStringType", + "symbols": [ + "MultiLineString" + ] + }, + "default": "MultiLineString" + }, + { + "name": "coordinates", + "type": { + "type": "array", + "name": "CoordinatesType", + "items": { + "type": "array", + "items": { + "type": "array", + "items": "double" + } + } + } + } + ] + }, + { + "name": "Polygon", + "doc": "Describes a Polygon geometry", + "type": "record", + "fields": [ + { + "name": "type", + "type": { + "type": "enum", + "name": "PolygonType", + "symbols": [ + "Polygon" + ] + }, + "default": "Polygon" + }, + { + "name": "coordinates", + "type": { + "type": "array", + "name": "CoordinatesType", + "items": { + "type": "array", + "items": { + "type": "array", + "items": "double" + } + } + } + } + ] + }, + { + "name": "MultiPolygon", + "doc": "Describes a MultiPolygon geometry", + "type": "record", + "fields": [ + { + "name": "type", + "type": { + "type": "enum", + "name": "MultiPolygonType", + "symbols": [ + "MultiPolygon" + ] + }, + "default": "MultiPolygon" + }, + { + "name": "coordinates", + "type": { + "type": "array", + "items": { + "type": "array", + "items": { + "type": "array", + "items": { + "type": "array", + "items": "double" + } + } + } + } + } + ] + } + ] + }, + { + "name": "properties", + "doc": "Any type, without infinite nesting, should be replaced during actual usage with a record with named properties.", + "type": { + "name": "Json", + "type": "map", + "values": [ + "null", + "boolean", + "string", + "int", + "long", + "float", + "double", + { + "type": "array", + "items": [ + "null", + "boolean", + "string", + "int", + "long", + "float", + "double" + ] + }, + { + "type": "map", + "values": [ + "null", + "boolean", + "string", + "int", + "long", + "float", + "double" + ] + } + ] + } + } + ] +} \ No newline at end of file diff --git a/docs/schemas/json-schema/key-schema.json b/docs/schemas/json-schema/key-schema.json new file mode 100644 index 0000000..e34b32f --- /dev/null +++ b/docs/schemas/json-schema/key-schema.json @@ -0,0 +1,5 @@ +{ + "$schema": "http://json-schema.org/draft-07/schema#", + "title": "A UUID", + "type": "string" +} \ No newline at end of file diff --git a/docs/schemas/json-schema/value-schema.json 
b/docs/schemas/json-schema/value-schema.json new file mode 100644 index 0000000..426f1e1 --- /dev/null +++ b/docs/schemas/json-schema/value-schema.json @@ -0,0 +1,506 @@ +{ + "$schema": "http://json-schema.org/draft-07/schema#", + "$id": "https://geocoders.github.io/geocodejson-spec/draft/geocodejson.schema.json", + "description": "GeocodeJSON is an extension of the GeoJSON format and it is an attempt to create a standard for handling geocoding results.", + "title": "GeoJSON Feature", + "type": "object", + "required": [ + "type", + "properties", + "geometry" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "Feature" + ] + }, + "id": { + "oneOf": [ + { + "type": "number" + }, + { + "type": "string" + } + ] + }, + "properties": { + "oneOf": [ + { + "type": "null" + }, + { + "type": "object" + } + ] + }, + "geometry": { + "oneOf": [ + { + "type": "null" + }, + { + "title": "GeoJSON Point", + "type": "object", + "required": [ + "type", + "coordinates" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "Point" + ] + }, + "coordinates": { + "type": "array", + "minItems": 2, + "items": { + "type": "number" + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } + }, + { + "title": "GeoJSON LineString", + "type": "object", + "required": [ + "type", + "coordinates" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "LineString" + ] + }, + "coordinates": { + "type": "array", + "minItems": 2, + "items": { + "type": "array", + "minItems": 2, + "items": { + "type": "number" + } + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } + }, + { + "title": "GeoJSON Polygon", + "type": "object", + "required": [ + "type", + "coordinates" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "Polygon" + ] + }, + "coordinates": { + "type": "array", + "items": { + "type": "array", + "minItems": 4, + "items": { + "type": "array", + "minItems": 2, + "items": { + "type": "number" + } + } + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } + }, + { + "title": "GeoJSON MultiPoint", + "type": "object", + "required": [ + "type", + "coordinates" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "MultiPoint" + ] + }, + "coordinates": { + "type": "array", + "items": { + "type": "array", + "minItems": 2, + "items": { + "type": "number" + } + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } + }, + { + "title": "GeoJSON MultiLineString", + "type": "object", + "required": [ + "type", + "coordinates" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "MultiLineString" + ] + }, + "coordinates": { + "type": "array", + "items": { + "type": "array", + "minItems": 2, + "items": { + "type": "array", + "minItems": 2, + "items": { + "type": "number" + } + } + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } + }, + { + "title": "GeoJSON MultiPolygon", + "type": "object", + "required": [ + "type", + "coordinates" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "MultiPolygon" + ] + }, + "coordinates": { + "type": "array", + "items": { + "type": "array", + "items": { + "type": "array", + "minItems": 4, + "items": { + "type": "array", + "minItems": 2, + "items": { + "type": "number" + } + } + } + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": 
"number" + } + } + } + }, + { + "title": "GeoJSON GeometryCollection", + "type": "object", + "required": [ + "type", + "geometries" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "GeometryCollection" + ] + }, + "geometries": { + "type": "array", + "items": { + "oneOf": [ + { + "title": "GeoJSON Point", + "type": "object", + "required": [ + "type", + "coordinates" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "Point" + ] + }, + "coordinates": { + "type": "array", + "minItems": 2, + "items": { + "type": "number" + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } + }, + { + "title": "GeoJSON LineString", + "type": "object", + "required": [ + "type", + "coordinates" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "LineString" + ] + }, + "coordinates": { + "type": "array", + "minItems": 2, + "items": { + "type": "array", + "minItems": 2, + "items": { + "type": "number" + } + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } + }, + { + "title": "GeoJSON Polygon", + "type": "object", + "required": [ + "type", + "coordinates" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "Polygon" + ] + }, + "coordinates": { + "type": "array", + "items": { + "type": "array", + "minItems": 4, + "items": { + "type": "array", + "minItems": 2, + "items": { + "type": "number" + } + } + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } + }, + { + "title": "GeoJSON MultiPoint", + "type": "object", + "required": [ + "type", + "coordinates" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "MultiPoint" + ] + }, + "coordinates": { + "type": "array", + "items": { + "type": "array", + "minItems": 2, + "items": { + "type": "number" + } + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } + }, + { + "title": "GeoJSON MultiLineString", + "type": "object", + "required": [ + "type", + "coordinates" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "MultiLineString" + ] + }, + "coordinates": { + "type": "array", + "items": { + "type": "array", + "minItems": 2, + "items": { + "type": "array", + "minItems": 2, + "items": { + "type": "number" + } + } + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } + }, + { + "title": "GeoJSON MultiPolygon", + "type": "object", + "required": [ + "type", + "coordinates" + ], + "properties": { + "type": { + "type": "string", + "enum": [ + "MultiPolygon" + ] + }, + "coordinates": { + "type": "array", + "items": { + "type": "array", + "items": { + "type": "array", + "minItems": 4, + "items": { + "type": "array", + "minItems": 2, + "items": { + "type": "number" + } + } + } + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } + } + ] + } + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } + } + ] + }, + "bbox": { + "type": "array", + "minItems": 4, + "items": { + "type": "number" + } + } + } +} \ No newline at end of file diff --git a/docs/schemas/protobuf/key-schema.proto b/docs/schemas/protobuf/key-schema.proto new file mode 100644 index 0000000..304658b --- /dev/null +++ b/docs/schemas/protobuf/key-schema.proto @@ -0,0 +1,5 @@ +syntax = "proto3"; + +message Key { + string id = 1; +} \ No newline at end of file diff --git 
a/docs/schemas/protobuf/value-schema.proto b/docs/schemas/protobuf/value-schema.proto new file mode 100644 index 0000000..4cea73c --- /dev/null +++ b/docs/schemas/protobuf/value-schema.proto @@ -0,0 +1,35 @@ +syntax = "proto3"; + +message Address { + + message Geometry { + string type = 1; + repeated double coordinates = 2; + } + + message Properties { + string label = 1; + double score = 2; + string id = 3; + string name = 4; + string postcode = 5; + string citycode = 6; + double x = 7; + double y = 8; + string city = 9; + string context = 10; + string type = 11; + double importance = 12; + string street = 13; + optional string banId = 14; + optional double population = 15; + optional string municipality = 16; + optional string locality = 17; + optional string oldcitycode = 18; + optional string oldcity = 19; + } + + string type = 1; + Geometry geometry = 2; + Properties properties = 3; +} \ No newline at end of file diff --git a/docs/screenshots/dark.png b/docs/screenshots/dark.png new file mode 100644 index 0000000..8c22595 Binary files /dev/null and b/docs/screenshots/dark.png differ diff --git a/docs/screenshots/light.png b/docs/screenshots/light.png new file mode 100644 index 0000000..b3c8613 Binary files /dev/null and b/docs/screenshots/light.png differ diff --git a/docs/search-filter/README.md b/docs/search-filter/README.md new file mode 100644 index 0000000..5613a05 --- /dev/null +++ b/docs/search-filter/README.md @@ -0,0 +1,121 @@ +
+<!-- logo of WebAssembly -->
+
+# Creating a search filter.
+ +Let's say you want to list all kafka records where the key ends with `1234`. +Currently, the query syntax doesn't offer such feature. Fortunately, you have the ability to extend the search engine and create your own search logic. + +You can implement what it's called **a search filter**. A search filter is a WebAssembly module that exports a `parse_parameters` and `matches` functions. You can pass string or number parameters to that function. It will look like this: + +```sql +-- from the beginning of the topic, retrieve all records where the key ends with '1234' +from begin where key-ends-with('1234') +``` + +The name of the function corresponds to the name of the wasm file. In the example above, the wasm file is `key-ends-with.wasm`. + +> [!TIP] +> Wasm files can be found at `yozf config get filters-dir`. + + + +## Defining your search filter + + +Yōzefu relies on [Extism](https://extism.org/) to develop and execute search filters. +The WebAssembly module we're going to implement must export 2 functions, `parse_parameters` and `matches`. + +The first step is to choose your preferred programming language. Extism supports different programming languages. You can read more at [Extism Quickstart Guide](https://extism.org/docs/quickstart/plugin-quickstart). I'll choose [golang](../../crates/wasm-blueprints/golang) for this example. A [Rust example](../../crates/wasm-blueprints/rust) is also available. + +```bash +yozf create-filter --language golang key-ends-with --directory /tmp/my-filter + +$EDITOR /tmp/my-filter +``` + +If you need more context about how WebAssembly is called from the Rust codebase, feel free to explore [filter.rs](../../crates/app/src/search/filter.rs). + +### Function `parse_parameters` + +This function ensures that user-provided parameters are valid. This function is called once at parsing time. +The function must return a status code `0` when it's valid. Another status code means the parameters are invalid. + +```golang +// golang example +// https://github.com/MAIF/yozefu/blob/main/crates/wasm-blueprints/golang/main.go + +//export parse_parameters +func ParseParameters() int32 { + var params FilterParams + err := pdk.InputJSON(¶ms) + if err != nil { + pdk.SetError(err) + return 1 + } + if len(params) != 1 { + pdk.SetError(errors.New(fmt.Sprintf("This search filter expects a string argument. Found %v arguments", len(params)))) + return 2 + } + return 0 +} +``` + + + +### Function `matches` + +This function receives a [JSON object](./filter-input.json) containing both the kafka record and the function parameters. It returns the json `{"match": true}` when the record matches your query. The output is represented by the struct [`FilterResult`](https://github.com/MAIF/yozefu/blob/main/crates/lib/src/search/mod.rs#L80-L89). This function is called for every kafka record read. 
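+For reference, here is an abridged version of [`filter-input.json`](./filter-input.json) from this directory (the `value` field is trimmed for brevity). For the query `key-ends-with('1234')`, a correct `matches` implementation is expected to answer with [`filter-result.json`](./filter-result.json), i.e. `{"match": true}`, since the key `01234` ends with `1234`:
+
+```json
+{
+  "record": {
+    "value": { "properties": { "label": "Rue Franz Kafka 51100 Reims" } },
+    "key": "01234",
+    "topic": "public-french-addresses",
+    "timestamp": 1732479526752,
+    "partition": 0,
+    "offset": 4,
+    "headers": {}
+  },
+  "params": ["1234"]
+}
+```
+
+The golang implementation below returns exactly that: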
+ + + +```golang +// golang example +// https://github.com/MAIF/yozefu/blob/main/crates/wasm-blueprints/golang/main.go + +// export matches +func Matches() int32 { + input := FilterInput{} + err := pdk.InputJSON(&input) + param := input.Params[0] + if err != nil { + pdk.SetError(err) + return False + } + if strings.HasSuffix(input.Record.Key, param) { + err = pdk.OutputJSON(FilterResult{Match: true}) + if err != nil { + pdk.SetError(err) + return 1 + } + return 0 + } + _ = pdk.OutputJSON(FilterResult{Match: false}) + return 0 +} +``` + + +### Build it + +Now, it's time to compile it to WebAssembly: + +```bash +# Make is not required but make things easier +make build +``` + +You can also implement and run tests: +```bash +$EDITOR ./tests/parameters.json +$EDITOR ./tests/match.json +$EDITOR ./tests/no-match.json +make test +``` + +Finally import your filter 🎉 +```bash +yozf import-filter 'plugin.wasm' --name "key-ends-with" + +yozf -c my-cluster --topics "my-topic" "from begin where key-ends-with('1234')" +``` \ No newline at end of file diff --git a/docs/search-filter/filter-input-schema..json b/docs/search-filter/filter-input-schema..json new file mode 100644 index 0000000..ba3d49b --- /dev/null +++ b/docs/search-filter/filter-input-schema..json @@ -0,0 +1,72 @@ +{ + "$schema": "http://json-schema.org/draft-07/schema#", + "$id": "https://github.com/MAIF/yozefu/blob/main/crates/wasm-types/docs/search-filter/filter-input-schema.json", + "title": "Schema of FilterInput, see https://github.com/MAIF/yozefu/blob/main/crates/wasm-types/src/lib.rs for more details", + "type": "object", + "properties": { + "record": { + "type": "object", + "properties": { + "value": { + "type":["number", "string", "boolean", "object", "array", "null"] + }, + "key": { + "type":["number", "string", "boolean", "object", "array", "null"], + "examples": ["92334E93-8466-4A9C-B9CA-EBB389CD5F01", "my-key"] + }, + "topic": { + "type": "string", + "minLength": 1, + "description": "The topic to which the record belongs.", + "examples": ["public-french-addresses", "patisserie-delights"] + }, + "timestamp": { + "type": "number", + "minimum" : 0, + "description": "The timestamp of the record in milliseconds since epoch." + }, + "partition": { + "type": "number", + "minimum" : 0, + "description": "The partition number of the record." + }, + "offset": { + "type": "number", + "minimum" : 0, + "description": "The offset of the record within the partition." + }, + "headers": { + "type": "object", + "description": "The headers of the record" + } + }, + "required": [ + "value", + "key", + "topic", + "timestamp", + "partition", + "offset", + "headers" + ] + }, + "params": { + "type": "array", + "description": "The parameters passed to the filter. 
it accepts numbers or strings.", + "items": { + "oneOf": [ + { + "type": "number" + }, + { + "type": "string" + } + ] + } + } + }, + "required": [ + "record", + "params" + ] +} \ No newline at end of file diff --git a/docs/search-filter/filter-input.json b/docs/search-filter/filter-input.json new file mode 100644 index 0000000..7101db2 --- /dev/null +++ b/docs/search-filter/filter-input.json @@ -0,0 +1,38 @@ +{ + "record": { + "value": { + "type": "Feature", + "geometry": { + "type": "Point", + "coordinates": [ + 4.0215, + 49.211613 + ] + }, + "properties": { + "label": "Rue Franz Kafka 51100 Reims", + "score": 0.6992354545454544, + "id": "51454_3782", + "name": "Rue Franz Kafka", + "postcode": "51100", + "citycode": "51454", + "x": 774438.44, + "y": 6901804.03, + "city": "Reims", + "context": "51, Marne, Grand Est", + "type": "street", + "importance": 0.69159, + "street": "Rue Franz Kafka" + } + }, + "key": "01234", + "topic": "public-french-addresses", + "timestamp": 1732479526752, + "partition": 0, + "offset": 4, + "headers": {} + }, + "params": [ + "1234" + ] +} \ No newline at end of file diff --git a/docs/search-filter/filter-result-schema.json b/docs/search-filter/filter-result-schema.json new file mode 100644 index 0000000..8426c67 --- /dev/null +++ b/docs/search-filter/filter-result-schema.json @@ -0,0 +1,16 @@ +{ + "$schema": "http://json-schema.org/draft-07/schema#", + "$id": "https://github.com/MAIF/yozefu/blob/main/crates/wasm-types/docs/search-filter/filter-result-schema.json", + "title": "Json schema of FilterResult. See https://github.com/MAIF/yozefu/blob/main/crates/lib/src/search/mod.rs#L81 for more details.", + "type": "object", + "properties": { + "match": { + "type": "boolean", + "description": "Whether the record matches the filter or not.", + "examples": [true, false] + } + }, + "required": [ + "match" + ] +} \ No newline at end of file diff --git a/docs/search-filter/filter-result.json b/docs/search-filter/filter-result.json new file mode 100644 index 0000000..6bb9697 --- /dev/null +++ b/docs/search-filter/filter-result.json @@ -0,0 +1,3 @@ +{ + "match": true +} \ No newline at end of file diff --git a/docs/search-filter/schema-dark.svg b/docs/search-filter/schema-dark.svg new file mode 100644 index 0000000..194d088 --- /dev/null +++ b/docs/search-filter/schema-dark.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/docs/search-filter/schema.svg b/docs/search-filter/schema.svg new file mode 100644 index 0000000..52ff00b --- /dev/null +++ b/docs/search-filter/schema.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/docs/themes/README.md b/docs/themes/README.md new file mode 100644 index 0000000..bb31c9a --- /dev/null +++ b/docs/themes/README.md @@ -0,0 +1,24 @@ +# Themes. + +Yozefu includes 3 default themes: `light`, `dark` and `dark-solarized`. + +These themes are defined in a [`themes.json` file](https://github.com/MAIF/yozefu/blob/main/crates/command/themes.json). You can get the location of the file with the command: +```bash +yozf config get themes_file +"/Users/me/Library/Application Support/io.maif.yozefu/themes.json" +``` + +The list of the themes can be obtained with the command: +```bash +yozf config get themes +``` + +🖌️ You are invited to create, update and share new themes. + + +## Using a theme + +You have 2 options to use a theme: + + - Use the `--theme ` flag when you run yozefu. 
+ - You can also edit the `config.json` file with the command `yozf config set /theme dark-solarized`
\ No newline at end of file
diff --git a/docs/tls/README.md b/docs/tls/README.md
new file mode 100644
index 0000000..338c261
--- /dev/null
+++ b/docs/tls/README.md
@@ -0,0 +1,141 @@
+# TLS Support
+

+ + + +

+ + +This page helps you configure TLS settings for different providers. +The steps are always the same: +1. Open the configuration with `yozf configure` +2. Edit the configuration file by adding a new cluster. +3. Save the file and run start the tool `yozf -c my-cluster` + + +## Confluent + +To connect to a confluent kafka cluster: + +1. Open https://confluent.cloud/environments +2. Select your cluster. +3. Click on **Clients** in the left menu. +4. Click on **Set up a new client** +5. Choose a Rust client. +6. Follow the instructions to generate an API key. +7. Open the configuration file: `yozf configure` +8. Edit the configuration: +```json +{ + "clusters": { + "confluent": { + "url_template": "https://confluent.cloud/environments//clusters//topics/{topic}/message-viewer", + "kafka": { + "bootstrap.servers": ".confluent.cloud:9092", + "security.protocol": "SASL_SSL", + "sasl.mechanisms": "PLAIN", + "sasl.username": "", + "sasl.password": "/topics/{topic}?p=-1&s=1&o={offset}#messages", + "kafka": { + "bootstrap.servers": ".any.eu-central-1.mpx.prd.cloud.redpanda.com:9092", + "security.protocol": "SASL_SSL", + "sasl.mechanisms": "PLAIN", + "sasl.username": "", + "sasl.mechanisms": "SCRAM-SHA-256", + "sasl.password": "" + } + } + } +} +``` + + + + + + + + +## Mutual TLS + +For more details about Mutual TLS, refer to the documentation: https://docs.confluent.io/platform/current/kafka/configure-mds/mutual-tls-auth-rbac.html.Certificates. +Please note that, according to [the documentation](https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md), certificates must be in PEM format. +```json +{ + "clusters": { + "acme": { + "url_template": "http://akhq.acme/cluster/{topic}/data?single=true&partition={partition}&offset={offset}", + "kafka": { + "bootstrap.servers": "kafka-1.acme:9092,kafka-2.acme:9092", + "security.protocol": "SSL", + "ssl.ca.location": "path/to/ca-certificate.pem", + "ssl.certificate.location": "path/to/certificate.pem", + "ssl.key.location": "path/to/client.key", + "ssl.key.password": "", + } + } + } +} +``` + + + +## Cloud providers + + +[Contributions are welcomed](https://github.com/MAIF/yozefu/edit/main/docs/tls.md) to improve this page. + + +| Provider | Tested | Documentation | +| --------------------- | ------- | ----------------------------------------------------------------------------------------------------------------------------- | +| Google Cloud Platform | `false` | https://cloud.google.com/managed-service-for-apache-kafka/docs/quickstart#cloud-shell | +| Amazon Web Services | `false` | https://docs.aws.amazon.com/msk/latest/developerguide/produce-consume.html | +| Microsoft Azure | `false` | https://learn.microsoft.com/fr-fr/azure/event-hubs/azure-event-hubs-kafka-overview | +| DigitalOcean | `false` | https://docs.digitalocean.com/products/databases/kafka/how-to/connect/ | +| OVH | `false` | https://help.ovhcloud.com/csm/en-ie-public-cloud-databases-kafka-getting-started?id=kb_article_view&sysparm_article=KB0048944 | \ No newline at end of file diff --git a/docs/try-it.sh b/docs/try-it.sh new file mode 100644 index 0000000..1030a2f --- /dev/null +++ b/docs/try-it.sh @@ -0,0 +1,123 @@ +#!/usr/bin/env bash + + +# This script starts a kafka docker container and publishes data coming from a json API to a topic. 
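+# It requires docker, git, bash, sed and jq (they are checked by check_commands_are_installed below).
+# jbang is optional: when it is installed, records are produced with MyProducer.java through the
+# schema registry; otherwise the script falls back to kafka-console-producer.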
+# +# Examples +# +# bash docs/try-it.sh +# bash docs/try-it.sh "Nantes" "json" "public-french-addresses-json" +# bash docs/try-it.sh "Narbonne" "jsonSchema" "public-french-addresses-json-schema" +# bash docs/try-it.sh "Niort" "avro" "public-french-addresses-avro" +# bash docs/try-it.sh "Nancy" "text" "public-french-addresses-text" +# bash docs/try-it.sh "Nimes" "malformed" "public-french-addresses-malformed" + + +set -eo pipefail + + +# When jbang is not installed, +# it uses the kafka-console-producer to produce records +function fallback_produce { + local topic="$1" + local query="$2" + + IFS=$'\n' + result=$(curl -s "https://api-adresse.data.gouv.fr/search/?q=$(echo -n "$query" | sed 's/ /%20/g')&limit=10" | jq -c '.features[]') + for item in $result; do + local key + key=$(uduidgen 2> /dev/null || true) + if [ -z "$key" ]; then + key="$RANDOM" + fi + echo "${key}##$item" | docker compose exec -T kafka /usr/bin/kafka-console-producer --broker-list localhost:9092 --topic "${topic}" --property parse.key="true" --property key.separator="##" & + echo " A new record has been produced." + done + wait +} + +# Make sure these commands are installed +function check_commands_are_installed { + for cmd in docker git bash sed jq; do + if ! command -v $cmd &> /dev/null; then + echo " ❌ This script requires programs to be installed on your machine. Unfortunately, I was not able to find '$cmd'. Install '$cmd' and try again." + exit 1 + fi + done +} + + +check_commands_are_installed + + +if [ "$BASH_SOURCE" = "" ]; then + if [ ! -d /tmp/yozefu ]; then + echo " 🪂 Cloning 'git@github.com:MAIF/yozefu.git' to '/tmp/yozefu'" + git clone git@github.com:MAIF/yozefu.git --depth 1 /tmp/yozefu + else + git -C /tmp/yozefu pull + fi + bash /tmp/yozefu/docs/try-it.sh + exit 0 +fi + +repo=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )/.." &> /dev/null && pwd ) +topic="public-french-addresses" +query="kafka" +type="json" + +if [ $# -ge 1 ]; then + query="$1" +fi + +if [ $# -ge 2 ]; then + type="$2" +fi + +if [ $# -ge 3 ]; then + topic="$3" +fi + +if [ $# -ge 4 ]; then + url="$4" +fi + + +echo " 🐋 Starting kafka" +docker compose -f "${repo}/compose.yml" up kafka schema-registry -d --wait +docker compose -f "${repo}/compose.yml" exec -T kafka \ + /usr/bin/kafka-topics \ + --create --if-not-exists \ + --bootstrap-server localhost:9092 \ + --partitions 1 \ + --topic "${topic}" + +if jbang --version &> /dev/null; then + jbang run ${repo}/docs/schemas/MyProducer.java --type "$type" --topic "$topic" "$query" +else + echo " ℹ️ About to use the default producer 'kafka-console-producer.sh'. Install jbang to create a kafka producer using the schema registry." + echo " 🏡 Searching french addresses matching the query '${query}'" + echo " 📣 About to producing records to topic '${topic}'" + fallback_produce "$topic" "$query" +fi + + +# Invite to try the tool +if ! command -v cargo &> /dev/null +then + if command -v yozf &> /dev/null + then + echo " 🎉 Finally, start the tool" + echo " yozf -c localhost" + else + echo -e " It looks like you haven't installed \033[1myozefu\033[0m yet:" + echo " 1. Go to https://github.com/MAIF/yozefu/releases/latest" + echo " 2. Download the binary that matches your operating system" + echo " 3. curl -L 'https://github.com/MAIF/yozefu/releases/download//yozefu--.tar.gz' | tar xvz" + echo " 4. mv yozefu-* yozf" + echo " 5. 
Run './yozf -c localhost'" + fi +else + echo " 🎉 Finally, start the tool" + echo " cargo run --manifest-path \"${repo}/Cargo.toml\" -- -c localhost" +fi \ No newline at end of file diff --git a/docs/url-templates/README.md b/docs/url-templates/README.md new file mode 100644 index 0000000..2271bcc --- /dev/null +++ b/docs/url-templates/README.md @@ -0,0 +1,24 @@ +# URL templates to switch to web applications + +In certain situations, you may need to view a Kafka record in a web browser. `yozefu` allows you to do so: select the Kafka record and press the o key (for **o**pen). This will open the corresponding URL in a new browser tab. + +The tool uses a URL template from the configuration file. this template is defined in the `.clusters..url_template` property, where `` is the specific cluster name you're using. + + +This list gives the different URL templates depending on the web application you use: + +| Nom | URL template | +| -------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------ | +| [Confluent](https://confluent.cloud) | https://confluent.cloud/environments/acme-environment/clusters/acme-cluster/topics/{topic}/message-viewer | +| [Control center](https://docs.confluent.io/platform/current/control-center/index.html) | https://control-center.acme/clusters/acme-cluster/management/topics/{topic}/message-viewer | +| [Redpanda Cloud](https://cloud.redpanda.com/) | https://cloud.redpanda.com/clusters/acme-cluster/topics/{topic}?p={partition}&s=1&o={offset}#messages | +| [Redpanda Console](https://www.redpanda.com/redpanda-console-kafka-ui) | https://redpanda-console.acme/topics/elections.electeurs.purge?p={partition}&s=1&o={offset}#messages | +| [AKHQ](https://akhq.io/) | https://akhq.acme/cluster/{topic}/data?single=true&partition={partition}&offset={offset}{offset} | +| [Kafka UI](https://docs.kafka-ui.provectus.io/) | https://kafka-ui.acme/ui/clusters/kafk/all-topics/{topic}/messages?limit=1&seekType=OFFSET&seekTo=0%3A%3A{offset} | +| [Kafdrop](https://github.com/obsidiandynamics/kafdrop) | https://kadrop.acme/topic/{topic}/messages?partition={partition}&offset={offset}&count=1 | +| [Kpow](https://factorhouse.io/kpow) | https://kpow.acme/#/tenant/__kpow_global/cluster/acme-cluster/data/inspect?offset={offset}&topic={topic}&partition={partition} | +| [Kouncil](https://kouncil.io/) | https://kouncil.acme/topics/messages/{topic} | +| [Kafbat UI](https://ui.docs.kafbat.io/) | https://kafbat-ui.acme/ui/clusters/acme-cluster/all-topics/{topic}/messages?limit=1&mode=FROM_OFFSET&offset={offset}&partitions={partition} | + + +At this time, [3 variables can be used](https://github.com/MAIF/yozefu/blob/main/crates/tui/src/component/ui.rs#L312-L318) in the URL template: `{topic}`, `{partition}` and `{offset}`. diff --git a/rust-toolchain.toml b/rust-toolchain.toml new file mode 100644 index 0000000..c9c5a47 --- /dev/null +++ b/rust-toolchain.toml @@ -0,0 +1,11 @@ +[toolchain] +profile = "default" +channel = "stable" +targets = [ + "aarch64-apple-darwin", + "x86_64-apple-darwin", + "x86_64-unknown-linux-gnu", + "aarch64-unknown-linux-gnu", + "x86_64-pc-windows-gnu", + "x86_64-pc-windows-msvc" +] \ No newline at end of file diff --git a/typos.toml b/typos.toml new file mode 100644 index 0000000..372dc95 --- /dev/null +++ b/typos.toml @@ -0,0 +1,2 @@ +[default.extend-words] +ratatui = "ratatui"