idproject (int64) | issuekey (int64) | created (string) | title (string) | description (string) | storypoints (float64) |
---|---|---|---|---|---|
7,603,319 | 18,696,124 |
2019-02-28 21:23:07.690
|
Create a connector for Snowflake in the UI
|
### Problem to solve
(Summarize the problem we are trying to solve in the form of we need to do [A], so that [B] can [C])
Currently, if we want to use Snowflake as a target, we can run the ETL, but a user cannot use the UI to analyze the data
### Target audience
(For whom are we doing this? Include a persona)
Anyone replicating the tutorials
### Further details
(Include use cases, benefits, and/or goals)
Have a full working demo with snowflake as a target
### Proposal
(How are we going to solve the problem? Try to include the user journey)
Create a connector for Snowflake - @iroussos, let's break down the problem if needed
### What does success look like, and how can we measure that?
(Define both the success metrics and acceptance criteria. Note that success metrics indicate the desired business outcomes, while acceptance criteria indicate when the solution is working correctly. If there is no way to measure success, link to an issue that will implement a way to measure this)
AC: We can run each demo e2e with target snowflake and play with the data in the UI
### Links / references
_Please note that this was taken from GitLab, to be changed accordingly_
| 4 |
7,603,319 | 18,266,934 |
2019-02-15 13:57:43.532
|
Planning/Exploration Sprint - should Meltano offer a hosted sandbox experience?
|
### Full Opportunity Assessment
@valexieva please add link here once written: --> both links below
### Problem(s) to solve
Our current product adoption path assumes users launch Meltano in their local development environment (their computer). However, due to the size of many data sets, this can cause issues with memory, performance, etc., which can limit user adoption and recurring MAU.
However, users may also be using Meltano on data that we do not want to be responsible for hosting (PII, GDPR, etc.), so we also need to think about guiding them to use a production environment that makes sense for their needs.
### Proposed Solution (to Validate)
Offer the user a way to use Meltano while hosting their data somewhere that can handle a big data set.
Will explore pros/cons of different options, including:
* Self-hosting Meltano on the user's existing infrastructure
* Self-hosting Meltano on a one-click install (e.g. Bitnami, Dreamhost, Amazon, etc.)
* Meltano hosted sandbox
### Tasks to Complete Planning/Exploration Sprint
* [x] Write an opportunity assessment for a hosted sandbox experience (template here: https://docs.google.com/document/d/1rKEihpSxzC_a499xzbuIh9Jd-MfteOaUTMVqQJBzAtA/edit#) - go / no-go + recommendation
* [x] Write an opportunity assessment for docker (template here: https://docs.google.com/document/d/1OG2HzICuj0v2C1itznPoW4mI5KtroSDKSmDIPxet0OE/edit#) - go / no-go + recommendation
* [x] Collect and discuss advantages / disadvantages of all options in the context of the product
* [ ] ~~Architectural design proposal, including development environment/ deployment to production~~
* [ ] ~~Feasibility discussion~~
* [ ] ~~Lightweight development design for a 1 week sprint experiment~~
* [x] Go / no-go for immediate action versus place in the backlog. **Decision is: NO GO**, we will execute on our Docker image for now and revisit later
| 4 |
7,603,319 | 18,053,807 |
2019-02-07 20:28:14.617
|
to be reused for something else
|
TBD
| 2 |
7,603,319 | 17,667,095 |
2019-01-25 15:57:36.900
|
Add workflow for custom taps and targets
|
### Problem to solve
_Summarize the problem we are trying to solve in the form of we need to do [A], so that [B] can [C]_
We currently don't support many taps and targets, and this poses a risk to adoption. We would like to have a very clear explanation / workflow for anyone willing to contribute
### Target audience
(For whom are we doing this? Include a persona)
Developers
### Further details
(Include use cases, benefits, and/or goals)
By adding more taps and targets, we hope to increase adoption by allowing companies to bring many different data sources into one place.
Developing taps and targets in Singer is painful, and we believe that by making this process smoother, we can improve the Singer contributing community, which is currently a roadblock to adoption for us.
### Proposal
(How are we going to solve the problem? Try to include the user journey)
- [x] Add a custom switch to `meltano add` - we should be able to add any tap/target, bypassing discovery.yml, from the CLI
- [x] Document steps required to create a custom tap and target
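For reference, a custom plugin added this way ends up declared directly in `meltano.yml`. A sketch of what such a declaration could look like (the plugin name, pip_url, and settings here are hypothetical, shown only to illustrate the shape):

```yaml
plugins:
  extractors:
  - name: tap-mycustom            # hypothetical custom tap, not in discovery.yml
    namespace: tap_mycustom
    pip_url: -e ./tap-mycustom    # local checkout; a git+https URL would also work
    executable: tap-mycustom
    capabilities:
    - catalog
    - discover
    - state
    settings:
    - name: api_key
      kind: password
```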
### What does success look like, and how can we measure that?
(Define both the success metrics and acceptance criteria. Note that success metrics indicate the desired business outcomes, while acceptance criteria indicate when the solution is working correctly. If there is no way to measure success, link to an issue that will implement a way to measure this)
### Links / references
_Please note that this was taken from GitLab, to be changed accordingly_
| 1 |
7,603,319 | 17,331,949 |
2019-01-14 16:50:41.819
|
Silence pip unless there is an error
|
As per the 2018-01-14 demo, we received this feedback:
> Pip version 10.0.1 red warning.
> Micael: Should we silence pip?
> Micael: We can either upgrade pip (I don’t like that) or silence the output unless there is an error (I like that)
I think we should simply silence pip unless there is a problem.
Make the error YELLOW and not RED when it's a warning.
| 1 |
7,603,319 | 17,268,497 |
2019-01-11 13:25:24.817
|
Remove `dbt` as a dependency in setup.py
|
Meltano doesn't need **dbt** to be installed as a dependency.
Let's remove it.
| 1 |
7,603,319 | 17,139,395 |
2019-01-07 16:09:06.349
|
Settings with SQLite
|
- [x] Refactor the frontend of the setting page to accept connections for SQLite.
| 1 |
7,603,319 | 14,522,544 |
2018-09-27 13:53:09.751
|
Snowflake Target
|
Because we are starting to move in the direction of using the singer taps and target protocol we will need a snowflake target.
cc @iroussos
| 13 |
7,603,319 | 98,557,810 |
2021-12-06 22:40:00.303
|
Consider a private meltano managed `.meltano/dbt/profiles.yml` for dbt
|
## Background
Currently Meltano creates a not-very-robust `profiles.yml` file which attempts to be "one size fits all", but it only supports a few different config options and has several magic environment variables it expects. This is a barrier to new users onboarding, and we frequently get "not declared" errors when those environment variables are missing from the runtime context.
Pros of status quo:
1. Gives an out-of-box dbt experience for three different generic database configs: postgres, redshift, and snowflake
- https://gitlab.com/meltano/files-dbt/-/blob/master/bundle/transform/profile/profiles.yml
2. Doesn't require a user to understand the `profiles.yml` format.
Cons of status quo:
1. Most users will end up deleting the extra profiles or modifying the file to make it their own. (Which feels wrong, like you are breaking the connection to Meltano.)
1. There's no support for other databases like Spark or Redshift.
1. Manual changes could get overridden if the files bundle is reinstalled.
1. Following from point 1, it's not clear at all if overriding/modifying the built-in `profiles.yml` is supported - or if there are other consequences of doing so.
## Proposal
1. Similarly to how Meltano pre-creates and manages config.json files and catalog.json files for taps, Meltano could auto-generate a hidden `profiles.yml` file internally to the `.meltano` directory.
1. Meltano can populate values from config in `meltano.yml`, rather than requiring special environment variables to be set by the user.
1. Meltano can pick what kind of dbt adapter to use based on the environment's `primary_loader` config (not yet built).
1. As users get more mature in their usage of dbt, they may _optionally_ provide a path to their own generated profiles.yml file. This could be an optional config option for the `dbt` transformer and/or of the meltano environment itself.
1. The user can optionally provide a profile name for dbt to use - as a config option for the `dbt` transformer.
- If not set, the dbt profile target can default to the environment name (`prod`, `userdev`, etc.).
1. Since the hidden file would not be version controlled, it can be adapted without breaking existing code. For instance, we could add handling for a Spark profile and we wouldn't need users to reinstall `files-dbt` bundle.
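As a rough illustration of the proposal, the generated file might look something like this (the path, profile name, and populated values are assumptions for the sketch; the point is that values come from `meltano.yml` config rather than magic environment variables):

```yaml
# .meltano/transformers/dbt/profiles.yml  (hypothetical, auto-generated by Meltano)
meltano:
  target: dev              # could default to the active Meltano environment name
  outputs:
    dev:
      type: postgres       # picked from the environment's primary_loader config
      host: localhost      # populated from loader config in meltano.yml
      port: 5432
      user: melty
      dbname: warehouse
      schema: analytics
```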
| 8 |
7,603,319 | 98,540,152 |
2021-12-06 17:05:45.966
|
Document pyenv/poetry use in contributor docs
|
Our getting started guide for contributors isn't very cohesive in how it describes the setup steps for contributing to meltano, the meltano api, or the meltano ui, and what the expected dev environment looks like.
1. For example in the "setting up your environment" section we have folks clone/poetry install https://meltano.com/docs/contributor-guide.html#setting-up-your-environment
2. The api dev section makes no further mention of poetry and has you do a vanilla `meltano init`
3. Similarly, the UI dev section also makes no further mention of poetry.
Would it be useful to include a preamble on working with pyenv/poetry, and then provide general contributor, api, and ui dev docs that take that into account? A lot of folks will already have experience with things like pyenv/virtualenv/poetry, but it might still be nice to touch on them and clarify their potential use for managing your dev environments.
| 1 |
7,603,319 | 98,479,511 |
2021-12-05 18:03:20.565
|
meltano run ci/cd pipeline project
|
Something similar to the `meltano/demo-project` running as part of our CI/CD suite, but using `meltano run`, would be handy. I think initially it would probably be wise to make it an optional stage (i.e. `allow_failure: true`), so that failures in that stage don't block our entire pipeline. Another option might be to initially run this as a scheduled pipeline: https://docs.gitlab.com/ee/ci/pipelines/schedules.html
| 4 |
7,603,319 | 98,435,750 |
2021-12-03 21:57:53.767
|
Command `dbt:run` should work without initializing extra environment variables
|
When running `meltano elt tap target --transform=run`, things "just work" with little additional work if you're using one of the existing transform packages, as in the gitlab demo project in a pipeline.
However, running `meltano invoke dbt:run` doesn't work because there are still several env vars we have to set. The big ones (like DBT_TARGET_SCHEMA) are covered very briefly in https://meltano.com/docs/transforms.html#running-a-transform-in-meltano but there's a lot more you need to potentially set if you're trying to use it from the env, namely: PG_DATABASE|PASSWORD|USERNAME|PORT|etc.
At first, I couldn't tell if those were all env vars we were just setting proactively every time you run `meltano elt` (because we can infer what we need there and what the target context is), or if there was another way I should configure this. i.e. is there a generated config file I should just reference? All the config files I found just seem to reference env vars though.
## Environment Variables currently expected by our bundled `profiles.yml`
| Variable | Meaning | Maps to |
| ------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `DBT_TARGET` | > This is the default target your dbt project will use. It must be one of the targets you define in your profile. Commonly it is set to `dev`. | Defaults to Meltano's `$MELTANO_LOAD__DIALECT`. With the `profiles.yml` currently packaged in the dbt file bundle, this means one of `snowflake`, `postgres` or `bigquery`. Like @tayloramurphy mentions in https://gitlab.com/meltano/meltano/-/issues/3104#note_804770702, this is confusing and not standard practice in dbt projects. |
| `DBT_TARGET_SCHEMA` | > The default schema that dbt will build objects in. | Defaults to `analytics`. |
| `SF_*` | Snowflake connection parameters. | These [leverage the `env_aliases`](https://gitlab.com/meltano/meltano/-/blob/544f7128b3efd783efe7ac7b7c109e80ef9f086c/src/meltano/core/bundle/discovery.yml#L2551-2578) injected at runtime by Meltano. So, they get the same values as `TARGET_SNOWFLAKE_*`. |
| `SF_ROLE` | Snowflake Role. | This one doesn't use `env_aliases` so I'm not sure if it gets injected... |
| `PG_*` | PostgreSQL connection parameters. | These [leverage the `env_aliases`](https://gitlab.com/meltano/meltano/-/blob/544f7128b3efd783efe7ac7b7c109e80ef9f086c/src/meltano/core/bundle/discovery.yml#L2299-2333) injected at runtime by Meltano. So, they get the same values as `TARGET_POSTGRES_*`. |
| `TARGET_BIGQUERY_*` | BigQuery connection parameters. | These use directly the same values as `TARGET_BIGQUERY_*`. |
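For context, a dbt `profiles.yml` target typically wires such variables in via dbt's `env_var` jinja function. A sketch of what the bundled postgres output plausibly looks like (the exact variable names beyond those listed in the table above are assumptions):

```yaml
postgres:
  type: postgres
  host: "{{ env_var('PG_HOST') }}"        # name assumed from the PG_* pattern above
  port: "{{ env_var('PG_PORT') | as_number }}"
  user: "{{ env_var('PG_USERNAME') }}"
  pass: "{{ env_var('PG_PASSWORD') }}"
  dbname: "{{ env_var('PG_DATABASE') }}"
  schema: "{{ env_var('DBT_TARGET_SCHEMA') }}"
```

If any of these is unset in the runtime context, dbt fails with the "env var required but not provided" style of error described above.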
| 4 |
7,603,319 | 98,429,315 |
2021-12-03 18:09:15.750
|
tap-slack extractor token config setting mislabeled
|
The [tap-slack setting](https://github.com/Mashey/tap-slack#setup) requires the `token` key, but we have it labeled `api_token`, so configuration doesn't work through the CLI.
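Until the label is fixed, one possible workaround (a sketch; it assumes the usual mechanism of redeclaring a plugin's settings in `meltano.yml`) is to declare the key the tap actually reads:

```yaml
plugins:
  extractors:
  - name: tap-slack
    settings:
    - name: token        # the key Mashey/tap-slack actually expects, instead of api_token
      kind: password
```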
| 1 |
7,603,319 | 98,425,476 |
2021-12-03 16:37:59.441
|
Document Nested Select Examples
|
Based on https://gitlab.com/meltano/sdk/-/issues/285#note_750983833 where incorrectly configured select criteria for a complex nested case looked like an SDK bug to the user. We should update [the documentation](https://meltano.com/docs/command-line-interface.html#select) to show the example that @edgarrmondragon provided. I believe the solution was that you need to include the highest level attribute before you can select nested attributes (correct me if I'm wrong).
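A minimal illustration of that rule for the docs, as `meltano.yml` `select` extras (the stream and attribute names are hypothetical): select the top-level attribute before any nested attribute inside it.

```yaml
plugins:
  extractors:
  - name: tap-example            # hypothetical extractor
    select:
    - users.address              # select the parent attribute first...
    - users.address.zip          # ...then the nested attribute within it
```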
| 1 |
7,603,319 | 98,375,509 |
2021-12-02 20:58:36.452
|
meltano run improvements for closer elt parity
|
For `meltano run` to be a full replacement for `meltano elt`, there are a few small additions that we'll probably need to consider. This originally popped up in https://gitlab.com/meltano/meltano/-/issues/2301#note_750941505 during the initial `meltano run` build.
1. ~~Support for incremental jobs~~ (now logged as #3130):
1. Job ID support via Job ID prefix sourced from a CLI flag or based on the activated meltano env. (https://gitlab.com/meltano/meltano/-/issues/2301#note_750941505)
1. Top-level `--full-refresh` to ignore saved state and do a full refresh. (https://gitlab.com/meltano/meltano/-/merge_requests/2432#note_757893658)
1. Top-level `--no-state-update` to not save state. (https://gitlab.com/meltano/meltano/-/merge_requests/2432#note_757893658)
1. Top-level `--force` to ignore locks or indications of already-running jobs. (https://gitlab.com/meltano/meltano/-/merge_requests/2432#note_757893658)
1. A "dry run" option ala `meltano run --dry-run ...` (https://gitlab.com/meltano/meltano/-/issues/2301#note_750941505)
Note: Behind the scenes - cleaning up the ELBContext -> ELTContext relationship and what singer plugins expect probably needs to be part of the Job support work.
| 12 |
7,603,319 | 98,284,875 |
2021-12-02 02:09:43.551
|
Document MacOS specific configuration step for contributing to the UI/API
|
On macOS Big Sur (and up?), port 5000 is [used by AirPlay/Control Center](https://developer.apple.com/forums/thread/682332). For new users setting up a development environment, this can cause a somewhat hard-to-troubleshoot issue, because `meltano ui` won't actually be able to start. It's easy enough to work around by [setting an alternate bind port](https://meltano.com/docs/settings.html#ui-bind-port), so we should just explicitly document this in the contributor guide and flag it for macOS users.
Long term, we have https://gitlab.com/meltano/meltano/-/issues/3088 to try to auto-detect/warn if the port is unavailable, and to change the default port for new projects.
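For the contributor guide, the workaround boils down to overriding the `ui.bind_port` setting, e.g. in `meltano.yml` (the port value here is arbitrary; the equivalent `MELTANO_UI_BIND_PORT` environment variable should also work):

```yaml
# meltano.yml — avoid macOS AirPlay's claim on port 5000
ui:
  bind_port: 5001
```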
| 1 |
7,603,319 | 98,280,413 |
2021-12-01 23:35:50.436
|
Clarify venv usage by Meltano when inheriting with an environment
|
Had a user ask today whether or not Meltano re-installed a plugin in a separate venv when a [Meltano Environment](https://meltano.com/docs/environments.html#inheritance) is used. I wasn't sure of the answer and the docs weren't clear.
@edgarrmondragon can you clarify this and make an MR to the docs?
| 1 |
7,603,319 | 98,278,569 |
2021-12-01 22:21:48.876
|
flakehell install failures
|
Both @alex1126 and I are running into an issue where flakehell is failing to install while setting up the pre-commit hooks documented in the [contributor guide](https://meltano.com/docs/contributor-guide.html#setting-up-your-environment):
```shell
(melty-3.8.12) ➜ meltano git:(master) poetry run pre-commit install --install-hooks
pre-commit installed at .git/hooks/pre-commit
[INFO] Installing environment for https://github.com/life4/flakehell.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
An unexpected error has occurred: CalledProcessError: command: ('/Users/syn/.cache/pre-commit/repo72kapjci/py_env-python3.8/bin/python', '-mpip', 'install', '.')
return code: 1
expected return code: 0
stdout:
Processing /Users/syn/.cache/pre-commit/repo72kapjci
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Installing backend dependencies: started
Installing backend dependencies: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'error'
stderr:
ERROR: Command errored out with exit status 1:
command: /Users/syn/.cache/pre-commit/repo72kapjci/py_env-python3.8/bin/python /Users/syn/.cache/pre-commit/repo72kapjci/py_env-python3.8/lib/python3.8/site-packages/pip/_vendor/pep517/in_process/_in_process.py prepare_metadata_for_build_wheel /var/folders/47/sydnn1xj5vv6krsn8n4krgjw0000gn/T/tmpet2njlov
cwd: /Users/syn/.cache/pre-commit/repo72kapjci
Complete output (36 lines):
Traceback (most recent call last):
File "/Users/syn/.cache/pre-commit/repo72kapjci/py_env-python3.8/lib/python3.8/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
main()
File "/Users/syn/.cache/pre-commit/repo72kapjci/py_env-python3.8/lib/python3.8/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/Users/syn/.cache/pre-commit/repo72kapjci/py_env-python3.8/lib/python3.8/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 164, in prepare_metadata_for_build_wheel
return hook(metadata_directory, config_settings)
File "/private/var/folders/47/sydnn1xj5vv6krsn8n4krgjw0000gn/T/pip-build-env-znhipti1/overlay/lib/python3.8/site-packages/flit_core/buildapi.py", line 49, in prepare_metadata_for_build_wheel
metadata = make_metadata(module, ini_info)
File "/private/var/folders/47/sydnn1xj5vv6krsn8n4krgjw0000gn/T/pip-build-env-znhipti1/overlay/lib/python3.8/site-packages/flit_core/common.py", line 396, in make_metadata
md_dict.update(get_info_from_module(module, ini_info.dynamic_metadata))
File "/private/var/folders/47/sydnn1xj5vv6krsn8n4krgjw0000gn/T/pip-build-env-znhipti1/overlay/lib/python3.8/site-packages/flit_core/common.py", line 193, in get_info_from_module
docstring, version = get_docstring_and_version_via_import(target)
File "/private/var/folders/47/sydnn1xj5vv6krsn8n4krgjw0000gn/T/pip-build-env-znhipti1/overlay/lib/python3.8/site-packages/flit_core/common.py", line 169, in get_docstring_and_version_via_import
m = sl.load_module()
File "<frozen importlib._bootstrap_external>", line 522, in _check_name_wrapper
File "<frozen importlib._bootstrap_external>", line 1022, in load_module
File "<frozen importlib._bootstrap_external>", line 847, in load_module
File "<frozen importlib._bootstrap>", line 265, in _load_module_shim
File "<frozen importlib._bootstrap>", line 702, in _load
File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 843, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/Users/syn/.cache/pre-commit/repo72kapjci/flakehell/__init__.py", line 5, in <module>
from ._cli import entrypoint, flake8_entrypoint
File "/Users/syn/.cache/pre-commit/repo72kapjci/flakehell/_cli.py", line 9, in <module>
from .commands import COMMANDS
File "/Users/syn/.cache/pre-commit/repo72kapjci/flakehell/commands/__init__.py", line 5, in <module>
from ._baseline import baseline_command
File "/Users/syn/.cache/pre-commit/repo72kapjci/flakehell/commands/_baseline.py", line 6, in <module>
from .._patched import FlakeHellApplication
File "/Users/syn/.cache/pre-commit/repo72kapjci/flakehell/_patched/__init__.py", line 2, in <module>
from ._app import FlakeHellApplication
File "/Users/syn/.cache/pre-commit/repo72kapjci/flakehell/_patched/_app.py", line 10, in <module>
from flake8.options.config import MergedConfigParser, get_local_plugins
ImportError: cannot import name 'MergedConfigParser' from 'flake8.options.config' (/private/var/folders/47/sydnn1xj5vv6krsn8n4krgjw0000gn/T/pip-build-env-znhipti1/normal/lib/python3.8/site-packages/flake8/options/config.py)
```
This seems to be related to https://github.com/flakehell/flakehell/issues/22 but the versions don't actually match up:
```
(melty-3.8.12) ➜ meltano git:(master) poetry run flakehell --version
FlakeHell 0.7.1
Flake8 3.9.2
For plugins versions use flakehell plugins
```
Digging some more, @edgarrmondragon found that pre-commit actually creates its own venvs and doesn't take pyproject.toml into account - so the latest versions of flakehell/flake8 are installed, which then triggers https://github.com/flakehell/flakehell/issues/22.
| 0 |
7,603,319 | 98,196,140 |
2021-11-30 19:32:18.459
|
Auto-detect UI port availability and possibly switch default for new projects
|
On macOS Big Sur (and up?), port 5000 is [used by AirPlay/Control Center](https://developer.apple.com/forums/thread/682332). For new users setting up a development environment, this can cause a somewhat hard-to-troubleshoot issue, because `meltano ui` won't actually be able to start. It's easy enough to work around by [setting an alternate bind port](https://meltano.com/docs/settings.html#ui-bind-port), so we could just explicitly document this in the contributor guide and flag it for macOS users.
Would it be worth considering changing the default api port from 5000 (and 8080 for the dev ui) to something like 8080 (and 8181 for the dev ui)?
| 4 |
7,603,319 | 98,122,488 |
2021-11-29 19:38:35.753
|
Update pinned version of target-bigquery
|
The latest version of adswerve/target-bigquery is [0.11.3](https://github.com/adswerve/target-bigquery/releases), but the plugin in `discovery.yml` is pinned to an earlier version: `git+https://github.com/adswerve/target-bigquery.git@v0.10.2`.
There have been some important bug fixes since then, like https://github.com/adswerve/target-bigquery/issues/9.
cc @pnadolny13 @tayloramurphy
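Until `discovery.yml` is updated, a project can override the pin itself. A sketch of the `meltano.yml` override (assuming the release tag follows the same `v`-prefix convention as the current pin):

```yaml
plugins:
  loaders:
  - name: target-bigquery
    pip_url: git+https://github.com/adswerve/target-bigquery.git@v0.11.3
```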
| 1 |
7,603,319 | 97,586,046 |
2021-11-18 22:29:16.071
|
Meltano version bumps and changelog grooming should not require AJ review
|
Logging this issue to remove myself (AJ) from the weekly release process.
Related issue for SDK: https://gitlab.com/meltano/sdk/-/issues/277
| 2 |
7,603,319 | 97,546,181 |
2021-11-18 10:53:03.304
|
Document how STATE is handled in greater detail
|
The specifics around the handling of 'incomplete state' (state from previous failed jobs) are not included in the [relevant section of the docs](https://meltano.com/docs/integration.html#incremental-replication-state).
Thread from [Slack](https://meltano.slack.com/archives/CMN8HELB0/p1637186511109700):
> I should know this already, but... If a loader fails in an ELT job, what happens to the data and the state? I reviewed the Meltano ELT reference architecture and it does not say, and I have not looked at the source. Is the main Meltano process actually sitting in the middle and watching the messages from extractor stdout and sending them to loader stdin, and watching stdout of the target for that state to know that the target has processed, and stored, records up to that state? I've had that assumption all along, but didn't really know.
> I know the feeling. It's one of those things that you really want to know is solid and understand, but I've also often had the feeling I didn't really understand what would happen in exceptional situations.
Taking the "code is truth" approach, these are the relevant places, I think:
https://gitlab.com/meltano/meltano/-/blob/master/src/meltano/core/plugin/singer/target.py#L21
https://gitlab.com/meltano/meltano/-/blob/master/src/meltano/core/runner/singer.py#L90
As each STATE message flows through, Meltano (or, optionally, the target it delegates to, it seems) writes the state.
It would be great to have one of the @meltano-engineering folks jump in and correct me, or even share links to detailed design docs!
| 2 |
7,603,319 | 98,548,093 |
2021-11-17 21:12:43.059
|
Allow users to name a default Environment in `meltano.yml`
|
## Summary
[//]: # (Concisely summarize the feature you are proposing.)
Let users set the name of the default environment in `meltano.yml`.
## Proposed benefits
[//]: # (Concisely summarize the benefits this feature would bring to yourself and other users.)
Users don't need to pass `--environment` to every command or set the `MELTANO_ENVIRONMENT` variable.
## Proposal details
[//]: # (In as much detail as you are able, describe the feature you'd like to build or would like to see built.)
```yaml
version: 1
default_environment: dev
environments:
- name: dev
...
- name: prod
...
```
means commands will use `dev` if `--environment` is not passed.
## Best reasons not to build
[//]: # (Will this negatively affect any existing functionality? Do you anticipate any breaking changes versus what may already be working today? Make the counter-argument to your proposal here.)
It could confuse users about which environment will be activated if they use `--environment` and `default_environment` at the same time, but that can easily be addressed in the docs by clearly giving precedence to the former.
| 4 |
7,603,319 | 97,509,233 |
2021-11-17 20:53:23.932
|
Support a --log-format short cut
|
Per @rabidaudio in https://gitlab.com/meltano/meltano/-/merge_requests/2395#note_731245620
> Would it be possible to configure the log format from CLI args? It seems like the way to configure it as designed now is through the log handler files. It seems clunky to ask users to turn on structured logs everywhere just so they show up nicely in Dagster. I suppose our Dagster pipelines could generate an alternate `logging.yaml` file and pass with `--log-config`, would this be the recommended way?
We could definitely add a basic `--log-format=[json|keyvalue|uncolored-console]` flag that switches meltano to use one of our other built-in formatters for the console logger *if* there's no logging config present.
It would be a nice shortcut for scenarios where folks don't want to supply a more elaborate logging config, and really just want to be able to switch from a human-friendly format to a machine-friendly format.
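As a stopgap today, a pipeline can pass an alternate config via `--log-config`. A sketch of a machine-friendly `logging.yaml` in standard Python `dictConfig` form (the `(): meltano.core.logging.json_formatter` factory path is an assumption based on Meltano's built-in formatters, to be verified against the logging docs):

```yaml
version: 1
disable_existing_loggers: false
formatters:
  json:
    (): meltano.core.logging.json_formatter   # assumed built-in JSON formatter factory
handlers:
  console:
    class: logging.StreamHandler
    formatter: json
    stream: ext://sys.stderr
root:
  level: INFO
  handlers: [console]
```

The proposed `--log-format=json` flag would effectively be shorthand for exactly this kind of config.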
| 2 |
7,603,319 | 97,423,721 |
2021-11-16 13:22:57.314
|
Support `meltano shell` command to spawn terminal in plugin context
|
## Change Log
- 2021-11-17: Added feedback from @edgarrmondragon and Docker ideas.
### Problem to solve
In order to improve the productivity of developers using plugins within a Meltano Project, we could support a `meltano shell` command that spawns a terminal session inside the plugin's context. This enables users to benefit from Meltano's config, virtual-env, and environment management features in a simple and intuitive way (i.e. without having to prefix each plugin invocation with `meltano invoke`).
### Target audience
Developers, Data Engineers, Data Scientists and Analytics Engineers working within a Meltano Project.
### Further details
This would complement the recently added `--environment` [feature](https://meltano.com/docs/environments.html#environments), allowing users to get on with _using_ wrapped tools without having to worry about configuration, virtual-envs, and dev/prod environment differences. It also cuts down on redundant setup/teardown operations on each `meltano invoke` invocation (e.g. Airflow `initdb` [here](https://gitlab.com/meltano/meltano/-/blob/master/src/meltano/core/plugin/airflow.py#L108)), and allows us to set up other convenience features like autocomplete in the shell context. Users may wish to have control of the development environment for each plugin, including adding additional tools for testing, linting, etc. (e.g. SQLFluff with dbt).
This may also be an opportunity to support Docker containers for development. `meltano shell` may just as easily implement an equivalent of `docker run -it fishtownanalytics/dbt` as `source /path/to/dbt/venv/bin/activate`. This may reduce the impact of cross-platform tooling support and configuration differences.
### Proposal
Example workflow for adding a new data integration using Meltano, dbt and Airflow:
```shell
$ git checkout -b 2345_add_gitlab_to_hub_pipeline
# find a tap that matches the source I want to integrate
$ meltano discover extractors
> tap-gitlab
> ...
# add and configure the tap
$ meltano add extractor tap-gitlab
$ meltano config tap-gitlab set projects "meltano/meltano meltano/sdk"
$ meltano config tap-gitlab set start_date 2021-10-01T00:00:00Z
# now lets create a schedule
$ meltano schedule gitlab-to-postgres tap-gitlab target-postgres @daily
# and run a sample extract in my development environment, so dbt has data
$ meltano --environment=dev_kp elt tap-gitlab target-postgres --job_id=gitlab-to-postgres
# with a new extractor added and data extracted,
# lets update our dbt models for the new source...
# activate a console within the 'dbt' virtual env created by Meltano Project,
# using secrets/env vars from my personal dev environment
$ meltano shell dbt --environment=dev_kp
# luckily a postgres-compatible dbt package exists for the tap I just added 😅
$ nano packages.yml
> packages:
- package: dbt-labs/gitlab
version: 0.7.0
$ dbt deps
# run the new package using creds from my dev_kp environment
$ dbt run --select gitlab
# done (optimistically)
$ exit
> back in parent project context
# finally, lets update my hub DAG in Meltano Project (doesn't exist yet - see [#2920](https://gitlab.com/meltano/meltano/-/issues/2920))
$ nano meltano.yml
> ...
dags:
meltano_hub:
interval: @daily
steps:
- name: gitlab
cmd: 'meltano elt tap-gitlab target-postgres --job_id=gitlab-to-postgres'
retries: 2
- name: google_analytics
cmd: 'meltano elt tap-google-analytics target-postgres --job_id=google-analytics-to-postgres'
- name: product_mart
cmd: 'meltano invoke dbt run --select models/marts/product'
depends_on:
- gitlab
- google_analytics
...
# wonder if the new DAG actually works?
# start a terminal session in the Airflow context
$ meltano shell airflow --environment=dev_kp
# Note: this could launch a local instance of `airflow webserver`
# for developer convenience 🤔
$ airflow dags list
> meltano_hub
$ airflow dags trigger meltano_hub
> runs DAG locally using config specified in dev_kp environment
# *goes and checks the Airflow UI on http://localhost:8080 and the test warehouse for data*
# done
$ exit
# lets see what the team thinks
$ git commit -am "added gitlab integration"
$ git push
$ glab mr create
# off it goes to CI ✉️
# profit ☕️
```
### What does success look like, and how can we measure that?
Users can work productively in a plugins context facilitated (and not obstructed) by Meltano Project. New users to a Project do not need to worry about the mechanics of installation and configuration; onboarding to an existing Project is as easy as `git clone` and `meltano shell <plugin name>` 🚀
### Regression test
(Ensure the feature doesn't cause any regressions)
- [ ] Write adequate test cases and submit test results
- [ ] Test results should be reviewed by a person from the team
### Links / references
- [#2920](https://gitlab.com/meltano/meltano/-/issues/2920) Meltano should output a DAG-like data structure with individual pipeline components
- [&82](https://gitlab.com/groups/meltano/-/epics/82) Improved dbt integration
- [#2210](https://gitlab.com/meltano/meltano/-/issues/2205) Add `docker_image` property to support non-Python/pip plugins
_Please note that this was taken from GitLab, to be changed accordingly_
## Option: Alternative framing around `meltano environment activate ...`
We should also consider whether this could be an environment-level invocation, such as `meltano environment activate`, which would be a much more robust version of sourcing the `.env` variables: it would step into a context where all invocations are already contextualized and can be invoked directly.
If an environment-level action, we get around issues of having to step separately into dbt and sqlfluff context separately, because all of the context for both would be set at time of activating the environment.
| 12 |
7,603,319 | 97,392,485 |
2021-11-16 01:58:12.557
|
State lost after ETL exception
|
### What is the current *bug* behavior?
The incremental state seems to have been completely lost, forcing a full re-sync of the DB.
### What is the expected *correct* behavior?
Meltano should always use the stored state.
### Steps to reproduce
I don't have a repro I'm afraid.
### Relevant logs and/or screenshots
```
# meltano --version
meltano, version 1.69.0
```
Running in a Docker container under GKE.
The last logs I got were:
```
meltano | Loading failed (2): CRITICAL ['Traceback (most recent call last):
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/urllib3/connectionpool.py", line 445, in _make_request
six.raise_from(e, None)
', ' File "<string>", line 3, in raise_from
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/urllib3/connectionpool.py", line 440, in _make_request
httplib_response = conn.getresponse()
', ' File "/usr/local/lib/python3.8/http/client.py", line 1347, in getresponse
response.begin()
', ' File "/usr/local/lib/python3.8/http/client.py", line 307, in begin
version, status, reason = self._read_status()
', ' File "/usr/local/lib/python3.8/http/client.py", line 268, in _read_status
line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
', ' File "/usr/local/lib/python3.8/socket.py", line 669, in readinto
return self._sock.recv_into(b)
', ' File "/usr/local/lib/python3.8/ssl.py", line 1241, in recv_into
return self.read(nbytes, buffer)
', ' File "/usr/local/lib/python3.8/ssl.py", line 1099, in read
return self._sslobj.read(len, buffer)
', 'socket.timeout: The read operation timed out
', '
During handling of the above exception, another exception occurred:
', 'Traceback (most recent call last):
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/urllib3/connectionpool.py", line 755, in urlopen
retries = retries.increment(
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/urllib3/util/retry.py", line 532, in increment
raise six.reraise(type(error), error, _stacktrace)
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/urllib3/packages/six.py", line 770, in reraise
raise value
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/urllib3/connectionpool.py", line 699, in urlopen
httplib_response = self._make_request(
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/urllib3/connectionpool.py", line 447, in _make_request
self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/urllib3/connectionpool.py", line 336, in _raise_timeout
raise ReadTimeoutError(
', "urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='bigquery.googleapis.com', port=443): Read timed out. (read timeout=60)
", '
During handling of the above exception, another exception occurred:
', 'Traceback (most recent call last):
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/target_bigquery/__init__.py", line 92, in main
for state in state_iterator:
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/target_bigquery/process.py", line 55, in process
for s in handler.on_stream_end():
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/target_bigquery/processhandler.py", line 261, in on_stream_end
self._do_temp_table_based_load(rows)
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/target_bigquery/processhandler.py", line 168, in _do_temp_table_based_load
raise e
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/target_bigquery/processhandler.py", line 136, in _do_temp_table_based_load
job = self._load_to_bq(
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/target_bigquery/processhandler.py", line 215, in _load_to_bq
load_job = client.load_table_from_file(
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 1760, in load_table_from_file
response = self._do_resumable_upload(
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 2085, in _do_resumable_upload
upload, transport = self._initiate_resumable_upload(
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 2127, in _initiate_resumable_upload
upload.initiate(
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/google/resumable_media/requests/upload.py", line 345, in initiate
response = _helpers.http_request(
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/google/resumable_media/requests/_helpers.py", line 136, in http_request
return _helpers.wait_and_retry(func, RequestsMixin._get_status_code, retry_strategy)
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/google/resumable_media/_helpers.py", line 150, in wait_and_retry
response = func()
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/google/auth/transport/requests.py", line 480, in request
response = super(AuthorizedSession, self).request(
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
', ' File "/project/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/requests/adapters.py", line 529, in send
raise ReadTimeout(e, request=request)
', "requests.exceptions.ReadTimeout: HTTPSConnectionPool(host='bigquery.googleapis.com', port=443): Read timed out. (read timeout=60)
"]
meltano | ELT could not be completed: Loader failed
ELT could not be completed: Loader failed
```
It seems the proximal cause is a timeout in BigQuery, but this appears to have caused Meltano to corrupt the state DB.
The first logs from the next ELT run are:
```
meltano | Running extract & load...
meltano | Found state from 2021-11-15 23:15:20.194728.
meltano | No state was found, complete import.
tap-mysql--incremental | time=2021-11-15 23:45:21 name=tap_mysql level=INFO message=Server Parameters: version: 5.7.34-google, wait_timeout: 28000, innodb_lock_wait_timeout: 50, max_allowed_packet: 33554432, interactive_timeout: 28800
tap-mysql--incremental | time=2021-11-15 23:45:21 name=tap_mysql level=INFO message=Server SSL Parameters(blank means SSL is not active): [ssl_version: ], [ssl_cipher: ]
tap-mysql--incremental | time=2021-11-15 23:45:21 name=tap_mysql level=WARNING message=Columns {'id'} are primary keys but were not selected. Adding them.
tap-mysql--incremental | time=2021-11-15 23:45:22 name=tap_mysql level=INFO message=Beginning sync for InnoDB table qwil.admin_comments_comment
tap-mysql--incremental | time=2021-11-15 23:45:22 name=tap_mysql level=INFO message=Stream qwil-admin_comments_comment is using full table replication
tap-mysql--incremental | time=2021-11-15 23:45:22 name=tap_mysql level=INFO message=Detected auto-incrementing primary key(s) - will replicate incrementally
tap-mysql--incremental | time=2021-11-15 23:45:22 name=tap_mysql level=INFO message=Running SELECT `content_type_id`,`object_id`,`time`,`comment`,`id`,`user_id` FROM `qwil`.`admin_comments_comment` WHERE `id` <= 2789 ORDER BY `id` ASC
tap-mysql--incremental | /project/.meltano/extractors/tap-mysql--incremental/venv/lib/python3.8/site-packages/pymysql/connections.py:1077: UserWarning: Previous unbuffered result was left incomplete
tap-mysql--incremental | warnings.warn("Previous unbuffered result was left incomplete")
```
The "previous unbuffered result was left incomplete" log sounds pretty suspicious, though it is coming from the `tap-mysql` and not meltano so perhaps it's a red herring. However you can clearly see that Meltano is a bit confused with the "found state" => "no state found" logs.
### Possible fixes
Is it possible that Meltano isn't flushing its writes to the state DB, such that on a hard crash it is corrupting the DB?
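If that's the failure mode, the fix would be to commit each state write as soon as it lands. A minimal sketch of that idea, using `sqlite3` as a stand-in for the system db (the table and function names here are hypothetical, not Meltano's actual schema):

```python
# Illustrative sketch of the suspected failure mode: an uncommitted state
# write dies with the process on a hard crash. Committing immediately
# after each write makes the bookmark durable.
import sqlite3

def save_state(conn, job_id, state_json):
    conn.execute(
        "INSERT OR REPLACE INTO job_state (job_id, state) VALUES (?, ?)",
        (job_id, state_json),
    )
    conn.commit()  # flush now; a later crash cannot discard this write

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE job_state (job_id TEXT PRIMARY KEY, state TEXT)")
save_state(conn, "gitlab-to-bigquery", '{"bookmark": "2021-11-15"}')
```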
### Further regression test
| 4 |
7,603,319 | 97,038,222 |
2021-11-09 19:20:44.213
|
Dumping state as part of an elt call should pull directly from DB
|
In https://meltano.slack.com/archives/C01PKLU5D1R/p1636061054200300, while troubleshooting a state issue related to https://gitlab.com/meltano/meltano/-/issues/3045, folks ran into issues with the `--dump=state` command failing. This was a bit of a red herring, or rather a side effect: the command simply complained that the state file couldn't be read, when in reality the state capability wasn't being advertised by the plugin, so no state file was even being produced.
While investigating that, we realized that `meltano elt --dump=state` wasn't actually hitting the system db directly. It indirectly makes a bare `_invoke()` call using [plugin_invoker.dump](https://gitlab.com/meltano/meltano/-/blob/master/src/meltano/core/plugin_invoker.py#L288), which behind the scenes triggers the [look_up_state_hook](https://gitlab.com/meltano/meltano/-/blob/master/src/meltano/core/plugin/singer/tap.py#L165), which in turn tries to read the state from the db and, if successful, [writes the state out to the state file](https://gitlab.com/meltano/meltano/-/blob/master/src/meltano/core/plugin/singer/tap.py#L246). After `_invoke()` completes, dump just tries to read the state file.
1. If we can deprecate/phase out the use of a state file, we could just have the dump pull the state straight from the system db and simplify the tap's code by ditching the state-file generation.
2. If we need to continue generating the state file - dump could still do that - but we'd definitely want to refactor dump and look_up_state_hook to share some code.
With either path, a single consolidated look_up_state method for both the tap and dump to use is probably the move.
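That consolidation could look roughly like this: one shared lookup that reads straight from the system db, with `--dump=state` layered on top. All names below are hypothetical sketches, not Meltano's actual code.

```python
# Sketch of the consolidation idea: a single state lookup used by both the
# tap's invocation hook and `--dump=state`, with no state file in between.
import json

def look_up_state(system_db, job_id):
    """Return the stored state payload for a job, or None if absent."""
    raw = system_db.get(job_id)  # stand-in for the real system-db query
    return json.loads(raw) if raw else None

def dump_state(system_db, job_id):
    """What `--dump=state` would do: read from the db, not a file."""
    state = look_up_state(system_db, job_id)
    if state is None:
        raise RuntimeError(f"No state found for job '{job_id}'.")
    return json.dumps(state)
```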
| 4 |
7,603,319 | 96,974,985 |
2021-11-09 01:07:15.626
|
Meltano should print a warning if any of these are missing from capabilities: (`state`, `discover`, (`catalog` XOR `properties`))
|
There are enough instances of expensive debug operations at this point (most recently [today](https://meltano.slack.com/archives/CKHP6G5V4/p1636419311194200?thread_ts=1636399782.175500&cid=CKHP6G5V4)), that I think we should be printing an explicit warning message whenever any of these capabilities are not found in `discovery.yml`:
1. If `state` capability is not declared, then during `meltano elt`:
> Warning: Capability 'state' is not declared for `{tap-name}`. Incremental sync bookmarks will be disabled as a result.
2. If `discover` capability is not declared, then during `meltano elt` and `meltano invoke`:
> Warning: Capability 'discover' is not declared for `{tap-name}`. Proceeding without catalog discovery.
3. If neither `catalog` nor `properties` capability is declared, then during `meltano invoke` and `meltano etl`:
> Warning: Capabilities `catalog` and `properties` are both undeclared for `{tap-name}`. Catalog and selection features are disabled.
4. During `add --custom`, set default capabilities to `discover,state,catalog` rather than `[]`.
5. Warn if `capabilities` is fully empty, during `meltano invoke` and perhaps also during `meltano install`.
6. `--dump` commands should fail immediately with a clear message about missing declared capabilities.
1. `--dump=catalog` should fail hard for missing `discover`, should warn for missing `catalog`.
1. `--dump=state` should fail hard for missing `state`.
I think these should probably print even when `--log-level=debug` is not set.
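A minimal sketch of the proposed checks; the message wording follows the proposal above, but the function itself is illustrative and not Meltano's actual internal API.

```python
# Hypothetical helper: collect warning messages for undeclared capabilities
# so they can be printed during `meltano elt` / `meltano invoke`.
def capability_warnings(tap_name, capabilities):
    """Return warning messages for capabilities missing from discovery.yml."""
    caps = set(capabilities)
    messages = []
    if "state" not in caps:
        messages.append(
            f"Warning: Capability 'state' is not declared for '{tap_name}'. "
            "Incremental sync bookmarks will be disabled as a result."
        )
    if "discover" not in caps:
        messages.append(
            f"Warning: Capability 'discover' is not declared for "
            f"'{tap_name}'. Proceeding without catalog discovery."
        )
    if not caps & {"catalog", "properties"}:
        messages.append(
            f"Warning: Capabilities 'catalog' and 'properties' are both "
            f"undeclared for '{tap_name}'. Catalog and selection features "
            "are disabled."
        )
    return messages
```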
| 4 |
7,603,319 | 96,808,344 |
2021-11-05 15:57:36.935
|
One-Step Publish for Meltano and SDK
|
We have a one-click prerelease publish in the SDK. That's a simpler problem because we don't have release notes and the version number is just an auto-generated `dev` bump using the CI build number.
Opening this issue to figure out how we get to similar simplicity for normal deployments.
## Goals
1. Continue frequent releases.
2. Reduce cost to dev productivity.
3. Make dual-channel releases (`edge`/`main`) more viable, as proposed in https://gitlab.com/meltano/meta/-/issues/135
## Status Quo
Here are the current manual deployment steps (excluding marketing/slack/tweets, etc.):
1. Open a release branch. (Triggers an auto-commit with the version bump.)
2. Open a tag from the release branch. (Triggers the PyPi publish.)
3. Groom the changelog. (Can be performed before, after, or in parallel with the above - since the changelog itself is not shipped.)
## Ideal State
### Changelog automation
I don't know as much about this space but I believe there are some available options.
### One-Step version bump, validation, and publish
Rather than requiring the tag to be created explicitly in a separate step, we can modify the CI pipeline to _also_ run the publish step in the special CI job that runs when a new branch is created like `release/*`.
The special CI pipeline for `release/*` tags can be removed in favor of the branch pipeline, to eliminate the need for a second step, and avoid "chaining" of auto-commits and CI jobs. We could still create the tag as part of the CI pipeline, rather than as a manual step.
### Proposed flow
Releases:
1. As release manager, I create a correctly-named release branch which triggers the release pipeline.
2. I click "Merge when pipeline succeeds" immediately - meaning, a successful CI run (including publish steps) will close out the version bump.
Change Log Grooming:
1. If needed, I create and merge a _separate_ changelog grooming MR.
| 8 |
7,603,319 | 96,807,828 |
2021-11-05 15:46:54.033
|
Block interface tests and mvp implementation
|
Issue to discuss and track our first block implementation.
**Draft/placeholder - actively editing ~Florian**
----
#### Mergeable/deliverables:
- [x] ExtractLoadBlocks
- [x] PluginCommandBlock
- [x] IOBlock - which is parent of:
- [ ] PublisherBlock
- [ ] ConsumerBlock
- [ ] TranslatorBlock
- [x] sample unit tests demonstrating construction of and use of the ExtractLoadBlocks
| 12 |
7,603,319 | 96,622,206 |
2021-11-02 21:22:40.816
|
Support a Python-based Plugin Architecture
|
Currently, all Python-specific plugin handlers live in our main `Meltano` repo and are installed in a single go.
To make a leaner install, we could (for instance) refactor "Singer EL", "dbt Transform", and "Airflow Orchestrator" into separate repositories.
For each type of plugin we want to support, we'd have to create an interface layer and decide which hooks/notifications/invocations are needed by the respective type of plugin.
Note: The refactoring effort for the Singer EL would likely be very large. It might make more sense to use orchestrators (Dagster or Airflow) first.
## How is this different?
Today, if we want to support Dagster in the same way we support Airflow, we have three levers we can control:
1. CLI-based 'command' definitions. This is basically just aliasing existing CLI commands from the tool. Not capable of deep integrations.
2. Files-based plugins. This is the least robust: we download some pre-configured files into the repo.
3. Python code-based approach. This is the only "deep" integration option we have today. This gives a lot of power to wrap the tool's capabilities, but it also requires expanding the code base of `Meltano` core repo.
This proposal would give an alternate path to the third option, enabling us to write an external plugin definition in python, and import it into the project without having it defined tightly as part of that repo.
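One way to picture the third option without growing the core repo: core defines an abstract interface, and an external package (registered, for example, via a Python entry point) implements it. Everything below is a hypothetical sketch; none of these names exist in Meltano today.

```python
# Hypothetical plugin interface that an external orchestrator package
# (e.g. a Dagster integration) would implement and register with Meltano.
from abc import ABC, abstractmethod

class OrchestratorPlugin(ABC):
    """Hooks an external orchestrator package would have to provide."""

    @abstractmethod
    def render_dag(self, schedule: dict) -> str:
        """Turn a Meltano schedule into the tool's DAG representation."""

    @abstractmethod
    def invoke(self, *args: str) -> int:
        """Run the tool's CLI; return its exit code."""

class DummyOrchestrator(OrchestratorPlugin):
    """Trivial implementation, standing in for a real external package."""

    def render_dag(self, schedule):
        return f"dag:{schedule['name']}@{schedule['interval']}"

    def invoke(self, *args):
        return 0
```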
| 40 |
7,603,319 | 96,616,553 |
2021-11-02 18:56:27.586
|
Prettier exception prints
|
On the structlog site, there's some example screenshots of exceptions that are inherently very easy to read due to formatting and structure.
Should we look at adopting something like [rich](https://rich.readthedocs.io/) or [better-exceptions](https://github.com/Qix-/better-exceptions) for Meltano and/or the SDK?
The docs for structlog seem to imply we only need to install the libraries, but I expect it may be slightly more work, and there could be reasons I'm not thinking of why we would not want to use them.
https://www.structlog.org/en/stable/development.html:

| 2 |
7,603,319 | 96,352,662 |
2021-10-28 20:23:09.893
|
elt jobs appear to complete successfully but exit with return code 1 occasionally
|
### What is the current *bug* behavior?
As reported by a user in slack (https://meltano.slack.com/archives/C01TCRBBJD7/p1635428092076900), they're intermittently running into a situation where elt commands appear, based on the output logs, to have completed successfully, but actually exit with a return code of 1.
They also reported that the job is marked as failed in the meltano db with the error message:
```
{"error": "No heartbeat recorded for 5 minutes. The process was likely killed unceremoniously."
```
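The quoted error suggests the job was marked failed by a heartbeat staleness check. A minimal sketch of such a check, mirroring the 5-minute window in the message (the names and threshold constant are hypothetical, not Meltano's actual code):

```python
# Illustrative staleness check: a job whose last heartbeat is older than
# 5 minutes is treated as having been killed unceremoniously.
from datetime import datetime, timedelta

HEARTBEAT_VALID_FOR = timedelta(minutes=5)

def is_unceremoniously_killed(last_heartbeat_at, now):
    """True if no heartbeat was recorded within the valid window."""
    return (now - last_heartbeat_at) > HEARTBEAT_VALID_FOR
```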
### What is the expected *correct* behavior?
_What should be happening?_
### Steps to reproduce
Unknown/TBD at this point.
### Relevant logs and/or screenshots
```
[2021-10-28 08:18:56,028] {bash_operator.py:157} INFO - meltano | elt | Incremental state has been updated at 2021-10-28 08:18:56.027750.
[2021-10-28 08:18:58,706] {bash_operator.py:157} INFO - meltano | elt | Extract & load complete!
[2021-10-28 08:18:58,707] {bash_operator.py:157} INFO - meltano | elt | Transformation skipped.
[2021-10-28 08:18:58,727] {bash_operator.py:157} INFO - (<State.FAIL: (3, ('RUNNING',))>, <State.SUCCESS: (2, ())>)
[2021-10-28 08:19:00,227] {bash_operator.py:161} INFO - Command exited with return code 1
[2021-10-28 08:19:00,385] {taskinstance.py:1150} ERROR - Bash command failed
Traceback (most recent call last):
File "/project/.meltano/orchestrators/airflow/venv/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
result = task_copy.execute(context=context)
File "/project/.meltano/orchestrators/airflow/venv/lib/python3.6/site-packages/airflow/operators/bash_operator.py", line 165, in execute
raise AirflowException("Bash command failed")
airflow.exceptions.AirflowException: Bash command failed
[2021-10-28 08:19:00,387] {taskinstance.py:1194} INFO - Marking task as FAILED. dag_id=meltano_secondary_full_table, task_id=extract_load, execution_date=20211027T070000, start_date=20211028T074156, end_date=20211028T081900
[2021-10-28 08:19:01,488] {local_task_job.py:102} INFO - Task exited with return code 1
```
### Possible fixes
### Further regression test
_Ensure we automatically catch similar issues in the future_
- [ ] Write additional adequate test cases and submit test results
- [ ] Test results should be reviewed by a person from the team
| 4 |
7,603,319 | 96,352,217 |
2021-10-28 20:12:20.489
|
Add lock artifacts during `meltano add` for stability and portability of plugin definitions
|
This has come up twice this week, especially in context of pipelines and integrations with external tools.
## Original proposal
Original proposal recommends an actual `.lock.yml` file.
<details><summary>Click to expand</summary>
1. `meltano lock` would generate or update a lock file - also in yaml format.
2. This would resolve all dependencies, including the `discovery.yml` ref and combination of all `include-path` files for multi-meltano files.
In future iterations this could be expanded to:
3. Include `catalog.json` caches.
4. ~~Resolve all config representations of `environments` so that each environment has a fully resolved schema (_except_ secrets and the content of any externally-managed environment variables.)~~ - Not needed in initial release.
5. Optionally, include rich information of the DAG structure for known/declared pipelines or schedules.
Based on this lock file artifact, external tools such as Terraform, Databand, etc. could build environments and/or obtain rich project metadata needed for integration.
This may also provide improved stability projects for projects running in the wild, as discussed in this comment: https://gitlab.com/groups/meltano/-/epics/128#note_734475476
</details>
### Alternate implementation option (2022-02-10)
Here's an alternative that just focuses on freezing the plugin definitions, ideally copying those verbatim from their source of truth on hub.meltano.com
#### Stable Installations via `meltano add`
We alter how `meltano add` installs new plugins:
1. Instead of installing directly from `discovery.yml` and relying on the `discovery.yml` definition for resolution, we go to https://hub.meltano.com and download the file associated with the plugin.
2. The downloaded definition file will be installed into `${MELTANO_PROJECT_DIR}/plugins/` as `${MELTANO_PROJECT_DIR}/plugins/extractors/tap-mytap--variantname.yml`.
3. An optional `plugin_definitions_path: <str>` config option in `meltano.yml` will allow users to change the default path. This would simply default to `${MELTANO_PROJECT_DIR}/plugins` if not set by the user.
- This is separate from [include_paths](https://docs.meltano.com/concepts/project#multiple-yaml-files) in `meltano.yml`, since (1) we need to know exactly where to download new definitions and (2) the plugin definition files adhere to a new plugin spec format and not to the `meltano.yml` project format.
4. After downloading the definition, the plugin entry will be added to `meltano.yml` exactly as is performed today.
5. Instead of resolving setting definitions via `discovery.yml`, `meltano.yml` will only refer to the downloaded definitions.
- This will require a one-time update to our parser to also interpret and import data from the native/standard plugin definitions from hub.meltano.com.
#### Definition updates via `meltano add --upgrade`
We add something like `meltano add extractor tap-mytap --upgrade` which will replace the current definition with the new definition.
#### Migration to locked definitions via `meltano install --lock` or `meltano install --freeze`
For projects already in the wild, we can add a `--lock` or `--freeze` option to `meltano install` which generates a "locked" plugin definition file for any which are missing. As above, this would create one file per plugin used in the project. If a plugin definition already exists, we skip adding that file.
#### "Single-file option"
I think we've declined this option so I'm collapsing it. Please sound off in comments if you have a strong preference otherwise.
<details><summary>Click to expand</summary>
~~As an alternative to downloading and adding files individually, we could follow basically the same behavior as described, except we import/merge definitions in a single file called something at the root of the Meltano project dir like `plugins-registry.yml`, `registered-plugins.yml`, or `plugins-definitions.yml`. Since it would be a single file, we could have it at a fixed path relative to the root, and perhaps in the future allow the user to specify their own path to that file.
(AJ Speaking) I'm personally less inclined towards the single-lock file approach after thinking on this more. The one-file-per-plugin approach creates cleaner file diffs and is more scalable for a large number of plugins.~~
</details>
#### Deprecating `discovery.yml`
After implementing this proposal, `discovery.yml` as it exists today would be fully deprecated. Projects which have not yet had their definitions locked would still fall back to the bundled `discovery.yml` file. For the purposes of not breaking users on older versions, we would simply leave the online version of this file (https://discovery.meltano.com/discovery.yml) frozen at a point in time without needing to continue maintaining updates in `discovery.yml`.
#### Spec versioning
As @aphethean calls out in a thread on this issue https://gitlab.com/meltano/meltano/-/issues/3031#note_842196683, we also need to handle spec versioning so that the file spec can evolve without breaking users.
For instance, @aphethean provides a sample used by Kubernetes (`apiVersion: networking.k8s.io/v1beta1`) and we can see something similar in `dbt`'s use of `config-version: 2`.
This makes sure we meet this requirement as expressed in the comment:
> 1. ‘meltano add extractor tap-csv’
> 1. Aeons later, tap-csv plugin definition will always work with my meltano project.
| 12 |
7,603,319 | 96,211,878 |
2021-10-27 02:41:01.317
|
CLI command `meltano upgrade` does not work when installed with `pipx`
|
There appears to be a bug in `meltano upgrade` when installed via `pipx install`.
1. Install with pipx: `pipx install meltano`.
2. Wait some time.
3. Print version to confirm version is stale. (`1.82.0` below)
4. Run `meltano upgrade`. Logs appear to successfully upgrade to the latest.
5. Print version (`meltano --version`) to confirm version is still stale. (still `1.82.0` in log below, should be `1.85.0`)
## Expected behavior
We should make `meltano upgrade` work with pipx-installed instances of `meltano`.
If not feasible, or as a stop gap, we could print a warning:
`Error. Detected pipx-managed environment. Please retry upgrade by running 'pipx upgrade meltano'.`
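One possible heuristic for that warning (an assumption, not an official pipx API): pipx installs each package into its own venv under a path containing "pipx", so the interpreter prefix gives it away.

```python
# Hypothetical detection sketch: guess whether we're running inside a
# pipx-managed venv by inspecting the interpreter prefix.
import sys

def looks_like_pipx_env(prefix=None):
    prefix = prefix if prefix is not None else sys.prefix
    return "pipx" in prefix.lower()

def upgrade_hint(prefix=None):
    """Return the proposed warning text if pipx is detected, else None."""
    if looks_like_pipx_env(prefix):
        return ("Error. Detected pipx-managed environment. Please retry "
                "upgrade by running 'pipx upgrade meltano'.")
    return None
```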
## Workaround
The simple workaround is just to use `pipx upgrade meltano` instead of `meltano upgrade`.
## Log
<details><summary>Click to expand</summary>
```console
aj@ajs-macbook-pro meltano % meltano --version
meltano, version 1.82.0
aj@ajs-macbook-pro meltano % meltano upgrade
Upgrading `meltano` package...
Collecting meltano
Downloading meltano-1.85.0.tar.gz (3.7 MB)
|████████████████████████████████| 3.7 MB 2.0 MB/s
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing wheel metadata ... done
Requirement already satisfied: networkx<3.0,>=2.2 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (2.6.2)
Requirement already satisfied: atomicwrites<2.0.0,>=1.2.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (1.4.0)
Requirement already satisfied: sqlalchemy<2.0.0,>=1.3.19 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (1.4.22)
Requirement already satisfied: click<8.0,>=7.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (7.1.2)
Requirement already satisfied: async_generator<2.0,>=1.10 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (1.10)
Requirement already satisfied: pyhocon<0.4.0,>=0.3.51 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (0.3.58)
Collecting jsonschema<4.0.0,>=3.0.0
Using cached jsonschema-3.2.0-py2.py3-none-any.whl (56 kB)
Requirement already satisfied: gunicorn<20.0.0,>=19.9.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (19.10.0)
Requirement already satisfied: flask-executor<0.10.0,>=0.9.2 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (0.9.4)
Requirement already satisfied: flask<2,>=1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (1.1.4)
Requirement already satisfied: ipython<8.0.0,>=7.5.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (7.26.0)
Requirement already satisfied: psycopg2-binary<3.0.0,>=2.8.5 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (2.9.1)
Requirement already satisfied: python-dotenv<0.15.0,>=0.14.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (0.14.0)
Requirement already satisfied: simplejson<4.0.0,>=3.16.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (3.17.3)
Requirement already satisfied: pyhumps==1.2.2 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (1.2.2)
Requirement already satisfied: smtpapi<0.5.0,>=0.4.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (0.4.7)
Requirement already satisfied: pyyaml<6.0.0,>=5.3.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (5.4.1)
Collecting werkzeug<2,>=1
Using cached Werkzeug-1.0.1-py2.py3-none-any.whl (298 kB)
Requirement already satisfied: aiohttp<4.0.0,>=3.4.4 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (3.7.4.post0)
Requirement already satisfied: flatten-dict<0.2.0,>=0.1.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (0.1.0)
Requirement already satisfied: sqlparse<0.4.0,>=0.3.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (0.3.1)
Requirement already satisfied: click-default-group<2.0.0,>=1.2.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (1.2.2)
Requirement already satisfied: requests<3.0.0,>=2.23.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (2.26.0)
Requirement already satisfied: flask-cors<4.0.0,>=3.0.7 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (3.0.10)
Requirement already satisfied: alembic<2.0.0,>=1.5.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (1.6.5)
Requirement already satisfied: bcrypt<4.0.0,>=3.2.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (3.2.0)
Requirement already satisfied: authlib<0.11,>=0.10 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (0.10)
Requirement already satisfied: flask-restful<0.4.0,>=0.3.7 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (0.3.9)
Requirement already satisfied: psutil<6.0.0,>=5.6.3 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (5.8.0)
Requirement already satisfied: snowflake-sqlalchemy<2.0.0,>=1.2.3 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (1.3.1)
Requirement already satisfied: markdown<4.0.0,>=3.0.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (3.3.4)
Requirement already satisfied: fasteners<0.16.0,>=0.15.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (0.15)
Requirement already satisfied: meltano-flask-security<0.2.0,>=0.1.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (0.1.0)
Requirement already satisfied: email-validator<2.0.0,>=1.1.2 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (1.1.3)
Requirement already satisfied: pypika<0.26.0,>=0.25.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (0.25.2)
Requirement already satisfied: watchdog<0.10.0,>=0.9.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (0.9.0)
Requirement already satisfied: python-gitlab<2.0.0,>=1.8.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (1.15.0)
Requirement already satisfied: flask-sqlalchemy<3.0.0,>=2.4.4 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano) (2.5.1)
Requirement already satisfied: attrs>=17.3.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from aiohttp<4.0.0,>=3.4.4->meltano) (21.2.0)
Requirement already satisfied: chardet<5.0,>=2.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from aiohttp<4.0.0,>=3.4.4->meltano) (4.0.0)
Requirement already satisfied: async-timeout<4.0,>=3.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from aiohttp<4.0.0,>=3.4.4->meltano) (3.0.1)
Requirement already satisfied: yarl<2.0,>=1.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from aiohttp<4.0.0,>=3.4.4->meltano) (1.6.3)
Requirement already satisfied: typing-extensions>=3.6.5 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from aiohttp<4.0.0,>=3.4.4->meltano) (3.10.0.0)
Requirement already satisfied: multidict<7.0,>=4.5 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from aiohttp<4.0.0,>=3.4.4->meltano) (5.1.0)
Requirement already satisfied: python-editor>=0.3 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from alembic<2.0.0,>=1.5.0->meltano) (1.0.4)
Requirement already satisfied: Mako in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from alembic<2.0.0,>=1.5.0->meltano) (1.1.4)
Requirement already satisfied: python-dateutil in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from alembic<2.0.0,>=1.5.0->meltano) (2.8.2)
Requirement already satisfied: cryptography in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from authlib<0.11,>=0.10->meltano) (3.4.7)
Requirement already satisfied: six>=1.4.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from bcrypt<4.0.0,>=3.2.0->meltano) (1.16.0)
Requirement already satisfied: cffi>=1.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from bcrypt<4.0.0,>=3.2.0->meltano) (1.14.6)
Requirement already satisfied: pycparser in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from cffi>=1.1->bcrypt<4.0.0,>=3.2.0->meltano) (2.20)
Requirement already satisfied: dnspython>=1.15.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from email-validator<2.0.0,>=1.1.2->meltano) (2.1.0)
Requirement already satisfied: idna>=2.0.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from email-validator<2.0.0,>=1.1.2->meltano) (3.2)
Requirement already satisfied: monotonic>=0.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from fasteners<0.16.0,>=0.15.0->meltano) (1.6)
Requirement already satisfied: itsdangerous<2.0,>=0.24 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from flask<2,>=1->meltano) (1.1.0)
Requirement already satisfied: Jinja2<3.0,>=2.10.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from flask<2,>=1->meltano) (2.11.3)
Requirement already satisfied: pytz in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from flask-restful<0.4.0,>=0.3.7->meltano) (2021.1)
Requirement already satisfied: aniso8601>=0.82 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from flask-restful<0.4.0,>=0.3.7->meltano) (9.0.1)
Requirement already satisfied: pathlib2>=2.3 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from flatten-dict<0.2.0,>=0.1.0->meltano) (2.3.6)
Requirement already satisfied: setuptools>=18.5 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from ipython<8.0.0,>=7.5.0->meltano) (56.0.0)
Requirement already satisfied: appnope in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from ipython<8.0.0,>=7.5.0->meltano) (0.1.2)
Requirement already satisfied: pexpect>4.3 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from ipython<8.0.0,>=7.5.0->meltano) (4.8.0)
Requirement already satisfied: backcall in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from ipython<8.0.0,>=7.5.0->meltano) (0.2.0)
Requirement already satisfied: matplotlib-inline in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from ipython<8.0.0,>=7.5.0->meltano) (0.1.2)
Requirement already satisfied: pygments in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from ipython<8.0.0,>=7.5.0->meltano) (2.9.0)
Requirement already satisfied: prompt-toolkit!=3.0.0,!=3.0.1,<3.1.0,>=2.0.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from ipython<8.0.0,>=7.5.0->meltano) (3.0.19)
Requirement already satisfied: decorator in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from ipython<8.0.0,>=7.5.0->meltano) (5.0.9)
Requirement already satisfied: pickleshare in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from ipython<8.0.0,>=7.5.0->meltano) (0.7.5)
Requirement already satisfied: traitlets>=4.2 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from ipython<8.0.0,>=7.5.0->meltano) (5.0.5)
Requirement already satisfied: jedi>=0.16 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from ipython<8.0.0,>=7.5.0->meltano) (0.18.0)
Requirement already satisfied: parso<0.9.0,>=0.8.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from jedi>=0.16->ipython<8.0.0,>=7.5.0->meltano) (0.8.2)
Requirement already satisfied: MarkupSafe>=0.23 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from Jinja2<3.0,>=2.10.1->flask<2,>=1->meltano) (2.0.1)
Collecting pyrsistent>=0.14.0
Downloading pyrsistent-0.18.0-cp39-cp39-macosx_10_9_x86_64.whl (68 kB)
|████████████████████████████████| 68 kB 2.3 MB/s
Requirement already satisfied: Flask-Mail>=0.7.3 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano-flask-security<0.2.0,>=0.1.0->meltano) (0.9.1)
Requirement already satisfied: Flask-Principal>=0.3.3 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano-flask-security<0.2.0,>=0.1.0->meltano) (0.4.0)
Requirement already satisfied: Flask-BabelEx>=0.9.3 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano-flask-security<0.2.0,>=0.1.0->meltano) (0.9.4)
Requirement already satisfied: passlib>=1.7 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano-flask-security<0.2.0,>=0.1.0->meltano) (1.7.4)
Requirement already satisfied: Flask-Login>=0.3.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano-flask-security<0.2.0,>=0.1.0->meltano) (0.5.0)
Requirement already satisfied: Flask-WTF>=0.13.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from meltano-flask-security<0.2.0,>=0.1.0->meltano) (0.15.1)
Requirement already satisfied: speaklater>=1.2 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from Flask-BabelEx>=0.9.3->meltano-flask-security<0.2.0,>=0.1.0->meltano) (1.3)
Requirement already satisfied: Babel>=1.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from Flask-BabelEx>=0.9.3->meltano-flask-security<0.2.0,>=0.1.0->meltano) (2.9.1)
Requirement already satisfied: blinker in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from Flask-Mail>=0.7.3->meltano-flask-security<0.2.0,>=0.1.0->meltano) (1.4)
Requirement already satisfied: WTForms in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from Flask-WTF>=0.13.1->meltano-flask-security<0.2.0,>=0.1.0->meltano) (2.3.3)
Requirement already satisfied: ptyprocess>=0.5 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from pexpect>4.3->ipython<8.0.0,>=7.5.0->meltano) (0.7.0)
Requirement already satisfied: wcwidth in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from prompt-toolkit!=3.0.0,!=3.0.1,<3.1.0,>=2.0.0->ipython<8.0.0,>=7.5.0->meltano) (0.2.5)
Requirement already satisfied: pyparsing>=2.0.3 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from pyhocon<0.4.0,>=0.3.51->meltano) (2.4.7)
Requirement already satisfied: aenum==2.1.2 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from pypika<0.26.0,>=0.25.1->meltano) (2.1.2)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from requests<3.0.0,>=2.23.0->meltano) (1.26.6)
Requirement already satisfied: certifi>=2017.4.17 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from requests<3.0.0,>=2.23.0->meltano) (2021.5.30)
Requirement already satisfied: charset-normalizer~=2.0.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from requests<3.0.0,>=2.23.0->meltano) (2.0.4)
Requirement already satisfied: snowflake-connector-python<3.0.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (2.5.1)
Requirement already satisfied: oscrypto<2.0.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (1.2.1)
Requirement already satisfied: azure-storage-blob<13.0.0,>=12.0.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (12.8.1)
Requirement already satisfied: pycryptodomex!=3.5.0,<4.0.0,>=3.2 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (3.10.1)
Requirement already satisfied: asn1crypto<2.0.0,>0.24.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (1.4.0)
Requirement already satisfied: azure-common<2.0.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (1.1.27)
Requirement already satisfied: pyjwt<3.0.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (2.1.0)
Requirement already satisfied: boto3<2.0.0,>=1.4.4 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (1.18.21)
Requirement already satisfied: pyOpenSSL<21.0.0,>=16.2.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (20.0.1)
Requirement already satisfied: msrest>=0.6.18 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from azure-storage-blob<13.0.0,>=12.0.0->snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (0.6.21)
Requirement already satisfied: azure-core<2.0.0,>=1.10.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from azure-storage-blob<13.0.0,>=12.0.0->snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (1.17.0)
Requirement already satisfied: botocore<1.22.0,>=1.21.21 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from boto3<2.0.0,>=1.4.4->snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (1.21.21)
Requirement already satisfied: jmespath<1.0.0,>=0.7.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from boto3<2.0.0,>=1.4.4->snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (0.10.0)
Requirement already satisfied: s3transfer<0.6.0,>=0.5.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from boto3<2.0.0,>=1.4.4->snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (0.5.0)
Requirement already satisfied: requests-oauthlib>=0.5.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from msrest>=0.6.18->azure-storage-blob<13.0.0,>=12.0.0->snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (1.3.0)
Requirement already satisfied: isodate>=0.6.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from msrest>=0.6.18->azure-storage-blob<13.0.0,>=12.0.0->snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (0.6.0)
Requirement already satisfied: oauthlib>=3.0.0 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from requests-oauthlib>=0.5.0->msrest>=0.6.18->azure-storage-blob<13.0.0,>=12.0.0->snowflake-connector-python<3.0.0->snowflake-sqlalchemy<2.0.0,>=1.2.3->meltano) (3.1.1)
Requirement already satisfied: greenlet!=0.4.17 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from sqlalchemy<2.0.0,>=1.3.19->meltano) (1.1.1)
Requirement already satisfied: ipython-genutils in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from traitlets>=4.2->ipython<8.0.0,>=7.5.0->meltano) (0.2.0)
Requirement already satisfied: argh>=0.24.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from watchdog<0.10.0,>=0.9.0->meltano) (0.26.2)
Requirement already satisfied: pathtools>=0.1.1 in /Users/aj/.pyenv/versions/3.9.5/lib/python3.9/site-packages (from watchdog<0.10.0,>=0.9.0->meltano) (0.1.2)
Building wheels for collected packages: meltano
Building wheel for meltano (PEP 517) ... done
Created wheel for meltano: filename=meltano-1.85.0-py3-none-any.whl size=3804221 sha256=90f383b7f4f21fbcc1ea83a344e008526e1bebf452501bf856dc1c90b4d6087c
Stored in directory: /Users/aj/Library/Caches/pip/wheels/93/4d/cb/60a6fec735d00bc166eb0897d2a749a4a4ee6c626d323492e7
Successfully built meltano
Installing collected packages: werkzeug, pyrsistent, jsonschema, meltano
Attempting uninstall: werkzeug
Found existing installation: Werkzeug 0.16.1
Uninstalling Werkzeug-0.16.1:
Successfully uninstalled Werkzeug-0.16.1
Attempting uninstall: jsonschema
Found existing installation: jsonschema 2.6.0
Uninstalling jsonschema-2.6.0:
Successfully uninstalled jsonschema-2.6.0
Successfully installed jsonschema-3.2.0 meltano-1.85.0 pyrsistent-0.18.0 werkzeug-1.0.1
The `meltano` package has been upgraded.
Reloading UI...
UI is not running
Updating files managed by plugins...
Updating file bundle 'dbt'...
Updated file bundle 'dbt'
Updating 'dbt' files in project...
Nothing to update
Applying migrations to system database...
[2021-10-26 19:30:41,733] [23420|MainThread|alembic.runtime.migration] [INFO] Context impl SQLiteImpl.
[2021-10-26 19:30:41,733] [23420|MainThread|alembic.runtime.migration] [INFO] Will assume non-transactional DDL.
System database up-to-date.
Recompiling models...
Meltano and your Meltano project have been upgraded!
aj@ajs-macbook-pro meltano % meltano --version
meltano, version 1.82.0
```
</details>
And:
```console
aj@ajs-macbook-pro meltano % which meltano
/Users/aj/.local/bin/meltano
aj@ajs-macbook-pro meltano % cat /Users/aj/.local/bin/meltano
#!/Users/aj/.local/pipx/venvs/meltano/bin/python
# -*- coding: utf-8 -*-
import re
import sys
from meltano.cli import main
if __name__ == '__main__':
sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
sys.exit(main())
```
## Addl. info (pipx debug)
This runs as expected and printed paths may be a clue:
```console
aj@ajs-macbook-pro meltano % pipx upgrade --verbose meltano
pipx >(setup:717): pipx version is 0.16.3
pipx >(setup:718): Default python interpreter is '/Users/aj/.pyenv/versions/3.8.10/bin/python3'
pipx >(run_pipx_command:168): Virtual Environment location is /Users/aj/.local/pipx/venvs/meltano
pipx >(needs_upgrade:69): Time since last upgrade of shared libs, in seconds: 25. Upgrade will be run by pipx if greater than 2592000.
pipx >(upgrade:91): Upgrading shared libraries in /Users/aj/.local/pipx/shared
upgrading shared libraries...
pipx >(run_subprocess:135): running /Users/aj/.local/pipx/shared/bin/python -m pip --disable-pip-version-check install --upgrade pip setuptools wheel
pipx >(_parsed_package_to_package_or_url:128): cleaned package spec: meltano
upgrading meltano...
pipx >(run_subprocess:135): running /Users/aj/.local/pipx/venvs/meltano/bin/python -m pip install --upgrade meltano
pipx >(run_subprocess:135): running <fetch_info_in_venv commands>
pipx >(get_venv_metadata_for_package:303): get_venv_metadata_for_package: 10964ms
pipx >(_parsed_package_to_package_or_url:128): cleaned package spec: meltano
pipx >(_symlink_package_apps:124): Same path /Users/aj/.local/bin/meltano and /Users/aj/.local/pipx/venvs/meltano/bin/meltano
upgraded package meltano from 1.82.0 to 1.85.0 (location: /Users/aj/.local/pipx/venvs/meltano)
aj@ajs-macbook-pro meltano % meltano --version
meltano, version 1.85.0
```
| 1 |
7,603,319 | 96,111,328 |
2021-10-25 20:32:35.418
|
Add CLI support for environments
|
In !2355 we did not add support for managing environments with the command line. We should have them! Based on discussion in https://gitlab.com/meltano/meltano/-/merge_requests/2383#note_713553913
At a minimum:
<details><summary>Old version</summary>
* `meltano add <environment>`
* `meltano remove <environment>`
* `meltano config <environment> (list/set/unset/reset)`
</details>
```console
$ meltano environment --help
Usage: meltano environment [OPTIONS] COMMAND [ARGS]...
Options:
--database-uri TEXT System database URI
--help Show this message and exit.
Commands:
add Add a new environment
list Enumerate available environments
remove Remove an environment definition
```
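If implemented, day-to-day usage might look like this (hypothetical, since the subcommands above are only proposed and do not exist yet):

```console
$ meltano environment add dev
$ meltano environment list
dev
$ meltano environment remove dev
```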
| 8 |
7,603,319 | 96,107,535 |
2021-10-25 19:06:40.230
|
Add Snowplow Tracking to Meltano
| ERROR: type should be string, got "https://gitlab.com/meltano/meltano/-/blob/master/src/meltano/core/tracking/ga_tracker.py seems to be the primary place where GA Events are defined. All instances of `GoogleAnalyticsTracker` would have to be updated as well.\r\n\r\nThe aim of this issue would be to have 100% parity for Snowplow events with what we had for GA.\r\n\r\n2022-02-24 Update\r\n\r\nAfter syncing with Snowcat Cloud and the team we're aiming to do the following:\r\n\r\n* Get the end-to-end data flowing to both GA and Snowplow\r\n* This means using the structured event format to achieve parity with what we have in GA\r\n * This is a known sub-optimal approach as we aim to overhaul all events with proper schemas and structure\r\n* New events should only be added following improved schema workflow and control\r\n\r\n## Implementation Outline\r\n\r\n<details><summary>(larger scope)</summary>\r\n\r\n- [ ] create a new `MeltanoTracking` class to track usage, with support for many `BaseTracker` instances\r\n- [ ] create a new `BaseTracker` class to host generic tracking methods\r\n- [ ] create new `SnowplowTracker` class, inheriting from `BaseTracker`, to forward to Snowplow\r\n- [ ] adapt `GoogleAnalyticsTracker` to inherit from `BaseTracker`\r\n- [ ] replace all uses of `GoogleAnalyticsTracker` with `MeltanoTracking`\r\n\r\n</details>" | 8 |
7,603,319 | 96,107,301 |
2021-10-25 19:01:44.612
|
Stand-up full Snowplow Infrastructure
|
Based on https://gitlab.com/meltano/meltano/-/issues/2972, we have decided to use and implement Snowplow for the open source event tracking.
Given that the team has experience with AWS and Snowplow, deploying the infrastructure on our own AWS infra is ideal since it gives us full control of the data.
This issue tracks the work to get everything stood up on the AWS side. At the end of it, we should have a URL we can send events to which will be enriched and landed in S3. [This project](https://gitlab.com/meltano/saved-from-gitlab/gitlab-com-infrastructure/-/tree/master/environments/aws-snowplow) can serve as a detailed guide on how to get it working.
https://gitlab.com/gitlab-com/gitlab-com-infrastructure/-/tree/master/environments/aws-snowplow is also available!
| 12 |
7,603,319 | 96,090,843 |
2021-10-25 14:24:43.999
|
Add role and batch_wait_limit_seconds setting to default target-snowflake
|
The Wise variant is missing the optional `role` setting: https://github.com/transferwise/pipelinewise-target-snowflake#configuration-settings

It is also missing `batch_wait_limit_seconds` (Integer, default: None): the maximum time to wait for a batch to reach `batch_size_rows`.

@pnadolny13 we should just do a full review to make sure everything is captured.
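Once added, configuring these settings in a project might look like this sketch of a `meltano.yml` fragment (values illustrative):

```yaml
loaders:
  - name: target-snowflake
    config:
      role: LOADER_ROLE                # optional Snowflake role to assume
      batch_wait_limit_seconds: 30     # max seconds to wait for a full batch
```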
| 1 |
7,603,319 | 96,054,288 |
2021-10-25 06:08:14.807
|
Decide on input variables for `terraform-meltano-aws` and similar IAC modules
|
As a follow-on from the conversation in #2987 (internal), I thought it would be helpful to start discussing what configuration levels would be built into our Meltano IAC module for Terraform. To be sure that we have a specific-enough proposal without getting overwhelmed, I think we can describe this in AWS constructs _first_ and later translate these into the relevant constructs for GCP, Azure, etc.
## Functional data project configuration
These inputs drive core functionality of the Meltano project:
- `meltano_project_file` (str) - The path to a `meltano.yml` file to use in configuration of the environment. This would be [parsed by terraform](https://www.terraform.io/docs/language/functions/yamldecode.html) and may affect which resources Terraform creates, and how many. (Altering `meltano.yml` may require rerunning `terraform apply`.)
- `meltano_project_dir` (str) - The path to the Meltano project folder containing job definitions, dbt transforms, etc.
- `meltano_environment_name` (str) - The name of the environment to deploy, corresponding to a key in the `environments` entry in `meltano.yml`. This drives environment variable names and other configs as needed for the chosen environment.
- `project_shortname` (str) - When AWS resources are created, they will be created with this text in the resource name.
- `aws_region` (required, str) - The region ID (e.g. `us-east-2`)
To keep the config DRY, most/all other configuration would ideally be migrated into _either_ the `environments` feature in `meltano.yml` _or_ the base plugin or utility config. Then `meltano.yml` (and possibly also `discovery.yml`) would be read as an input to the Terraform module's configuration.
## Optional, AWS infrastructure reuse and interop:
These inputs are all optional, and they allow the Terraform module to reuse existing AWS resources if desired:
- `use_existing_vpc_id` (optional, str) - The existing VPC ID to use. If null, a new VPC will be created.
- `use_existing_private_subnets` (optional, list(str)) - The existing private subnets to use. If null, a new VPC will be created.
- `use_existing_public_subnets` (optional, list(str)) - The existing public subnets to use. If null, a new VPC will be created.
- `use_existing_s3_logging_path` (optional, str) - The S3 path to use for logging purposes, when S3 log location is needed. If not provided, a new S3 bucket will be created.
- `use_existing_s3_artifacts_path` (optional, str) - The S3 path to use for storing temporary and long-term internal project artifacts, such as lambda function definitions, zip artifacts, and other internal project file storage. If not provided, a new S3 bucket will be created.
There may be more here but generally the config would follow the same pattern. In the first iterations, these may be all automatic - meaning we may not at first allow reusing existing resources. However, as the module matures, it should accept an increasing amount of interop and reuse with other environments, either for reasons of cost savings (RDS, K8 cluster, etc.) or corporate compliance policies (subnets, logging bucket, etc.). Most of these resources are free or have negligible cost, so probably would not need to prioritize reuse except for compliance reasons and for interop with existing application infrastructure.
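As a sketch, the inputs above might translate into Terraform variable declarations like the following (types and defaults illustrative, not a finished module interface):

```hcl
variable "meltano_project_file" {
  type        = string
  description = "Path to the meltano.yml file used to configure the environment."
}

variable "aws_region" {
  type        = string
  description = "The AWS region ID, e.g. us-east-2."
}

variable "use_existing_vpc_id" {
  type        = string
  default     = null
  description = "Existing VPC ID to reuse; a new VPC is created when null."
}
```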
| 8 |
7,603,319 | 95,910,043 |
2021-10-21 17:14:48.628
|
Always print `--log-level=debug` guidance after failure
|
In slack we are often providing the guidance that people can rerun their pipelines with `--log-level=debug` - generally needed when a tap or target cannot properly configure itself or fails auth.
What do folks think about _always_ printing this after failure?
Something like:
> .... has failed. For more detailed log messages check '...' log directory or rerun with the --log-level=debug CLI flag.
Related slack thread: https://meltano.slack.com/archives/C013EKWA2Q1/p1634836288002100?thread_ts=1634832556.001800&cid=C013EKWA2Q1
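A rough sketch of what this could look like in the CLI's failure handling (the function name and exact message wording here are hypothetical, not Meltano's actual implementation):

```python
import sys

def print_failure_guidance(job_name, log_dir):
    """Print (and return) the proposed post-failure hint."""
    msg = (
        f"{job_name} has failed. For more detailed log messages check the "
        f"'{log_dir}' log directory or rerun with the --log-level=debug CLI flag."
    )
    print(msg, file=sys.stderr)
    return msg
```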
| 4 |
7,603,319 | 95,821,081 |
2021-10-20 19:53:31.034
|
Uninstall documentation
|
### Problem to solve
Need to make it clearer that the way to force a reinstall of a tap is to delete the directory in the `.meltano` directory.
Currently there is documentation on how to [install](https://meltano.com/docs/plugin-management.html#custom-plugins) a custom plugin, but especially when using custom taps, it isn't immediately clear how to remove them.
### Target audience
Developers building custom taps and using meltano as part of the development process.
### Further details
This currently is relatively undocumented; only in [Slack](https://meltano.slack.com/archives/CFG3C3C66/p1594084482487700?thread_ts=1593999699.428400&cid=CFG3C3C66) is it made clear that removing files in `.meltano/extractors` will trigger a complete reinstall, i.e. once deleted, `meltano install <type> <name> --reinstall` will reinstall it.
### Proposal
Provide a proper reinstall option, i.e. `meltano install <type> <name> --reinstall`, with the same effect as `rm -rf .meltano/extractors/tap-<name>`.
Currently in the [docs](https://meltano.com/docs/plugin-management.html#installing-your-project-s-plugins) it indicates that reinstall is possible, but this does not do a complete reinstall.
> To (re)install a specific plugin in your project, use meltano install <type> <name>, e.g. meltano install extractor tap-gitlab.
### What does success look like, and how can we measure that?
Able to remove a tap without deleting files in `.meltano`
### Links / references
* [Install docs](https://meltano.com/docs/plugin-management.html#installing-your-project-s-plugins)
* [Slack conversation](https://meltano.slack.com/archives/CFG3C3C66/p1594084482487700?thread_ts=1593999699.428400&cid=CFG3C3C66)
_Please note that this was taken from GitLab, to be changed accordingly_
| 1 |
7,603,319 | 95,641,264 |
2021-10-18 21:12:26.578
|
Python 3.6 EOL
|
The official EOL date for Python 3.6 is 2021-12-23: https://endoflife.date/python
Users can check their Python version by running `python --version` at the command line.
Beginning in Jan 2022, new releases of Meltano and the SDK will no longer offer support for Python 3.6.
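Projects that want to fail fast on an unsupported interpreter could add a guard like the following (a sketch, not something Meltano itself ships):

```python
import sys

def meets_minimum_python(min_version=(3, 7)):
    """Return True when the running interpreter satisfies the minimum version."""
    return sys.version_info[:2] >= min_version

if not meets_minimum_python():
    raise RuntimeError("Python 3.6 reached EOL on 2021-12-23; please upgrade to 3.7+.")
```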
| 2 |
7,603,319 | 95,439,962 |
2021-10-14 15:57:42.907
|
meltano | Transformation failed (2): Env var required but not provided: 'MELTANO_LOAD_SCHEMA'
|
According to @DouweM, this can occur with a custom loader that doesn't define a `schema` setting, so the env var doesn't get populated and dbt doesn't know which schema to read from.
I think Meltano is probably functioning as designed, but this feels like a bug to me. In a world where we have 10s of taps loading to the same environment (#2869), dbt should probably be mapping to schema or tap names statically (in yaml config) rather than dynamically (in env vars).
We should also strive to improve the user experience for users creating their own transformation sets, which will increasingly become the norm as we move away from the more fragile `transform` packages that expect a specific database dialect (generally Postgres) and a specific variant of the given tap.
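One incremental mitigation, assuming the dbt version in use supports the default argument to `env_var`, is to give the schema a static fallback so the transform no longer hard-fails when Meltano doesn't populate the variable (the schema name below is illustrative):

```yaml
# sources.yml (sketch): fall back to a static schema when the env var is unset
sources:
  - name: tap_gitlab
    schema: "{{ env_var('MELTANO_LOAD_SCHEMA', 'tap_gitlab') }}"
```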
Labeling as a bug because of the user impact, although in fairness, I do think this is functioning as originally designed.
| 8 |
7,603,319 | 95,246,721 |
2021-10-11 22:13:58.040
|
Problems with environment variables taking precedence over declared ones
|
If a plugin configuration option is set to a specific environment variable, but that config option also has a default environment variable it can be configured with, and _both_ are defined, Meltano uses the default one rather than the explicitly declared one.
Example:
```bash
meltano init demo
meltano add extractor tap-gitlab
meltano config tap-gitlab set private_token '$MY_REAL_GITLAB_API_TOKEN'
export GITLAB_API_TOKEN=wrong-token
export MY_REAL_GITLAB_API_TOKEN=right-token
meltano config tap-gitlab
```
```
{
"api_url": "https://gitlab.com",
"private_token": "wrong-token",
"groups": "",
"projects": "",
"ultimate_license": false,
"fetch_merge_request_commits": false,
"fetch_pipelines_extended": false
}
```
To me, this is unexpected behavior. I would expect `GITLAB_API_TOKEN` to be used if I hadn't specified anything for `private_token`, but since I explicitly told Meltano to use `MY_REAL_GITLAB_API_TOKEN` I'd expect that to take precedence.
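The precedence argued for here can be sketched as a small resolver (function name and structure hypothetical; this is not Meltano's actual implementation):

```python
import os

def resolve_setting(explicit_env=None, default_envs=()):
    """Return the explicitly declared env var's value when it is set,
    falling back to the default aliases only when it is not."""
    if explicit_env and explicit_env in os.environ:
        return os.environ[explicit_env]
    for name in default_envs:
        if name in os.environ:
            return os.environ[name]
    return None

os.environ["GITLAB_API_TOKEN"] = "wrong-token"
os.environ["MY_REAL_GITLAB_API_TOKEN"] = "right-token"
print(resolve_setting("MY_REAL_GITLAB_API_TOKEN", ("GITLAB_API_TOKEN",)))  # right-token
```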
## Proposed Solution (AJ)
I think we should deprecate `env: CUSTOM_ENV_NAME` and `env_aliases: [...]` in favor of the more explicit `<plugin_name>_<setting_name>`. I've created #3120 to track this.
| 4 |
7,603,319 | 95,061,309 |
2021-10-07 18:37:09.583
|
Hide `Explore` and `Dashboards` in UI by default
|
To help new users, I think we should remove `Explore` and `Dashboards` from the new user experience here:

Rather than having these on by default, I would like users to have to provide an environment variable to enable them, with the caveat that this is legacy functionality and the corresponding activities are not being actively supported.
This is a half-step short of removing the functionality entirely, and would focus on reducing frustration for new users who naturally and understandably gravitate to the UI experience first.
Presumably, users could optionally type in the URLs directly if needed. Goal here is to reduce discovery during new-user experience, not so much prevent someone who already is using this functionality from leveraging it.
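A sketch of the proposed gating (the env var naming scheme here is hypothetical, not an existing Meltano setting):

```python
import os

def legacy_ui_enabled(feature):
    """Legacy UI views stay hidden unless explicitly enabled via an env var."""
    value = os.environ.get(f"MELTANO_UI_ENABLE_{feature.upper()}", "false")
    return value.lower() in ("1", "true", "yes")
```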
| 1 |
7,603,319 | 94,566,331 |
2021-09-29 18:44:24.961
|
Revert per build network bridge work around
|
As a workaround for https://gitlab.com/meltano/meltano/-/issues/2951, where we were seeing network timeouts on package installs, [we deployed a workaround](https://gitlab.com/meltano/meltano/-/commit/4f2ce5b060a45aff37a580710f3b48c1c7215f63) recommended upstream by GitLab. However, GitLab has since worked with their provider to resolve the underlying issue.
The workaround we deployed is a subtle change in behavior in how networking in our CI containers is built/configured so we should revert this change until we want to (intentionally) make use of this feature.
| 1 |
7,603,319 | 94,553,384 |
2021-09-29 14:48:49.656
|
API endpoint for testing an extractor instantiates an ABC
|
### What is the current *bug* behavior?
_What is happening now?_
From a Slack conversation (https://meltano.slack.com/archives/CMN8HELB0/p1632924967171100), it seems like the test endpoint is instantiating the abstract base class `PluginTestService` instead of `ExtractorTestService`.
### What is the expected *correct* behavior?
_What should be happening?_
The endpoint should test an extractor correctly.
### Steps to reproduce
_How one can reproduce the issue?_
Try hitting the `POST /<plugin_ref:plugin_ref>/configuration/test` endpoint.
### Relevant logs and/or screenshots
_Please use code blocks (\`\`\`) to format console output_
```
[2021-09-29 16:13:56,137] [74486|MainThread|meltano.api.app] [ERROR] Exception on /api/v1/orchestrations/extractors/tap-hubspot/configuration/test [POST]
[...]
TypeError: Can't instantiate abstract class PluginTestService with abstract method validate
[2021-09-29 16:13:56,140] [74486|MainThread|meltano.api] [INFO] Error: 500 Internal Server Error: The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application.
```
### Possible fixes
_If you can, link to the line of code that might be responsible for the problem or suggest a fix_
Replace `PluginTestService` with `ExtractorTestService` in the function body.
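The error is standard Python behavior for abstract base classes. A minimal reproduction (only the class names come from the issue; the bodies are illustrative stand-ins):

```python
from abc import ABC, abstractmethod

class PluginTestService(ABC):
    """Abstract base: instantiating it raises the TypeError from the log."""

    @abstractmethod
    def validate(self):
        ...

class ExtractorTestService(PluginTestService):
    """Concrete subclass the endpoint should instantiate instead."""

    def validate(self):
        return True

try:
    PluginTestService()  # TypeError: Can't instantiate abstract class ...
except TypeError as err:
    print(err)

# The fix: instantiate the concrete class.
ExtractorTestService().validate()
```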
### Further regression test
_Ensure we automatically catch similar issues in the future_
- [ ] Write additional adequate test cases and submit test results
- [ ] Test results should be reviewed by a person from the team
| 1 |
7,603,319 | 94,045,037 |
2021-09-21 14:09:09.498
|
Download Log button in UI doesn't download the most recent one
|
Hi everyone,
I noticed the download button on the Meltano UI seems not to work as described. Instead of downloading the most recent log after a run of a pipeline, it somehow gets an older one. I couldn't narrow down why it picks exactly that log, but with ongoing runs it keeps jumping back to that specific log all the time.
Best
Nic
| 4 |
7,603,319 | 93,745,232 |
2021-09-15 20:55:12.744
|
Document GitLab's container registry as an option for pulling images
|
Based on a comment from https://gitlab.com/meltano/meltano/-/issues/2934#note_678349501
| 4 |
7,603,319 | 93,616,199 |
2021-09-14 07:23:10.606
|
Make Docker images available in AWS public ECR gallery
|
Using Docker has become pretty annoying if you use a code pipeline. You will invariably get "toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading:".
Since we build stuff with Meltano, we get this quite a bit. Would it be possible for Meltano to upload its images to the AWS public ECR gallery as well? See here: https://gallery.ecr.aws/
| 4 |
7,603,319 | 93,649,221 |
2021-09-13 20:12:51.635
|
Create a GitHub Action for Meltano ELT
|
## Summary
[//]: # (Concisely summarize the feature you are proposing.)
GitHub Actions are a CI/CD offering with capabilities similar to those of GitLab CI, except that they can be published to [a marketplace](https://docs.github.com/en/actions/creating-actions/publishing-actions-in-github-marketplace) where they get more visibility and users can add them to their projects with one click.
This action would be useful for developers who are first trying out Meltano, or who don't need a full Kubernetes/cloud deployment with an orchestrator, metrics UIs, etc.
## Proposed benefits
[//]: # (Concisely summarize the benefits this feature would bring to yourself and other users.)
Give users that want to try out Meltano for ELT a way that is quick and easy.
## Proposal details
[//]: # (In as much detail as you are able, describe the feature you'd like to build or would like to see built.)
- A [composite action](https://docs.github.com/en/actions/creating-actions/creating-a-composite-action) is probably easier but we can consider a [docker action](https://docs.github.com/en/actions/creating-actions/creating-a-docker-container-action) too.
Either way, the action would install the latest or some configurable Meltano version for the user and any other dependencies, and also run `meltano install`.
- Inputs to the job could be: `tap`, `target`, `job_id`, `transform` and `log_level` to compose a command like
```shell
meltano --log-level={log_level} elt {tap} {target} --job_id={job_id} --transform={transform}
```
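A composite action along these lines could look roughly like the following (the file layout and input wiring are illustrative, not a published action):

```yaml
# action.yml (sketch -- names and layout are illustrative)
name: "Meltano ELT"
description: "Run a Meltano ELT pipeline"
inputs:
  tap:
    description: "Extractor to run"
    required: true
  target:
    description: "Loader to run"
    required: true
  job_id:
    description: "Job ID used to store state"
    required: true
  transform:
    description: "Transform option (run/skip/only)"
    default: "skip"
  log_level:
    description: "Meltano log level"
    default: "info"
runs:
  using: "composite"
  steps:
    - name: Install Meltano and project plugins
      shell: bash
      run: |
        pip install meltano
        meltano install
    - name: Run ELT
      shell: bash
      run: >
        meltano --log-level=${{ inputs.log_level }} elt
        ${{ inputs.tap }} ${{ inputs.target }}
        --job_id=${{ inputs.job_id }}
        --transform=${{ inputs.transform }}
```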
## Best reasons not to build
[//]: # (Will this negatively affect any existing functionality? Do you anticipate any breaking changes versus what may already be working today? Make the counter-argument to your proposal here.)
This only solves ELT and very basic orchestration needs. No fancy UI for jobs and retries are manual.
| 1 |
7,603,319 | 93,464,825 |
2021-09-10 14:09:32.391
|
Create Local Deployment File Bundle
|
Having produced a proof-of-concept local deployment (#2910), we would now like to publish this as a [File Bundle](https://meltano.com/docs/plugins.html#file-bundles) for installation via `meltano add`. This includes:
- Publishing the Helm Charts for Meltano and Airflow
- Helm expects charts to be hosted under plain http(s) urls. I intend to use GitLab Pages to fulfil this requirement.
- Publishing the Terraform Modules for Kind, Postgres and Docker
- Terraform supports installing modules [directly from a git repository](https://www.terraform.io/docs/language/modules/sources.html#generic-git-repository)
- Creating a File Bundle that includes an `infrastructure/local` dir, `main.tf` file and other artefacts needed to launch the local deployment.
| 12 |
7,603,319 | 93,276,548 |
2021-09-07 16:43:33.852
|
(Meta) Add top-level "job" with "tasks" in Meltano.yml, refactored from just Schedules to Jobs+Schedules
|
#### Update 2022-05-26
* Job support added in https://gitlab.com/meltano/meltano/-/merge_requests/2629 - no related issue as MR was scoped down.
* Support for schedules tracked in https://gitlab.com/meltano/meltano/-/issues/3479
----
Currently a "Schedule" in `meltano.yml` is called a "Pipeline" in the UI. (#1673)
The proposed change is to add a `jobs` construct, with one or more steps - similar to the proposal of a multi-step `meltano run` but stored in the yaml file and able to be scheduled.
This is essentially the translation of `meltano run` and `meltano invoke` statements into a DAG definition. Importantly, though, we need to be mindful of how much complexity we want to introduce.
Instead of what we have now that is restricted to just EL pairs:
- A schedule has a tap and target and transform spec (only/skip/etc.).
We'd have:
- A named `job` has one or more `tasks`
- A `schedule` points to a named `job` _or_ the `job` contains a `schedule` description
- Optionally, we can choose to support parallel groups of tasks, such as invoking multiple taps together.
- Parallel groups might run serially if invoked locally via the CLI, but they'd generate a parallel structure when passed to an external orchestrator such as Airflow.
The assumption is that these would all run using `meltano run`.
For example:
```yaml
jobs:
- name: daily-eltp
tasks:
# EL pipelines can run in parallel:
- - tap-shopify target-snowflake
- tap-github target-snowflake
# These dbt build steps run in sequence:
- dbt:run --full-refresh
- dbt:test
# Last mile steps can run in parallel:
- - superset:build # BI build
- tap-snowflake target-yaml # reverse ETL can run in parallel with BI build
- name: hourly-refresh
tasks:
- tap-slack target-snowflake
- dbt:run
# schedules are separate and can reference jobs
schedules:
# ...
```
## Defining steps with rich metadata (out of scope for iteration 1)
We should plan for future support of rich task definitions (which could have `name`, `type`, `needs`, etc.) instead of simple strings.
The shorthand of simple string representation is easy to understand, read, and maintain. However, it cannot handle advanced DAG dependency representations. Supporting both smartly will likely be needed in the future for advanced use cases:
```yaml
jobs:
- name: daily-eltp
tasks:
# A complex task object could be inlined with string-based tasks
- type: invoke
command: dbt:run
config: ... # perhaps overrides here or perhaps some other future syntax options
- dbt:test
- - superset:build # BI build
- tap-snowflake target-yaml # reverse ETL can run in parallel with BI build
```
For each item in the tasks array, if the type is `string`, pass along basically as a `meltano run` arg. If the type is object (dict in python), then parse as a complex task.
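That dispatch rule could be sketched as follows (hypothetical helper, not Meltano code; the normalized shapes are one possible design):

```python
def parse_task(entry):
    """Normalize one `tasks` entry per the rule above (illustrative only)."""
    if isinstance(entry, str):
        # shorthand: pass the string along as a `meltano run` argument
        return {"type": "run", "command": entry}
    if isinstance(entry, list):
        # nested list: a parallel group of tasks
        return {"type": "parallel", "tasks": [parse_task(t) for t in entry]}
    if isinstance(entry, dict):
        # rich task object: already explicit, just default the type
        return {"type": entry.get("type", "invoke"), **entry}
    raise TypeError(f"unsupported task entry: {entry!r}")
```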
## Jobs' relationship to schedules
Schedules likely should be environment specific, but perhaps this should be held until a future iteration.
```yaml
schedules:
- name: daily-eltp-schedule
job: daily-eltp
interval: daily
environments:
- prod
- name: hourly-refresh-schedule
job: hourly-refresh
interval: hourly
environments:
- prod
- dev
#...
```
## Referencing a job from `meltano run`
We would expand `meltano run` to accept job names as well as plugin names.
So assuming 2 jobs are defined, `dbt-daily-refresh` and `daily-el-sync`, these invocations could be valid:
_Possibly out of scope for first iteration_
### Example A: EL steps followed by a named job.
Runs one EL task and then all the tasks in the 'dbt-daily-refresh' job.
```bash
meltano run tap-slack target-snowflake dbt-daily-refresh
```
### Example B: A job followed by a list of commands.
Runs all the EL tasks in the `daily-el-sync` job and then runs dbt.
```bash
meltano run daily-el-sync dbt:run dbt:test
```
### Example C: Two sequential jobs.
Runs all the tasks in the `daily-el-sync` job and then all the tasks in the 'dbt-daily-refresh' job.
```bash
meltano run daily-el-sync dbt-daily-refresh
```
## Regarding name collisions
In order for jobs to be referenced in `meltano run`, there cannot be collisions between plugin names and job names.
## CLI Interface
#### jobs
`meltano job <job_name>` add a job
`meltano job <job_name> add schedule <interval>` Schedule the job
`meltano job <job_name> add schedule '[<interval>, ...]' ` Add more schedules
`meltano job <job_name> tasks <run commands>` # should this be set?
`meltano job <job_name> tasks '[<run command>, ...]' `
#### schedules
`meltano schedule <schedule_name> set <job_name> <interval> <environment>`
`meltano schedule <schedule_name> set <job_name>`
`meltano schedule <schedule_name> set <interval>`
`meltano schedule <schedule_name> set <environment>`
`meltano schedule list [--format=json]` Does this work with jobs that are scheduled via the job definition?
`meltano schedule run <schedule_name>` Similar to current behavior this would pass any CLI args to `meltano run`
### meltano run syntax
"Run" would accept a job name in place of a command or plugin name.
```
meltano run <job_name>
```
And job names and command/plugin names can be sequenced:
```
meltano run <job_name> dbt:run dbt:build
```
In this scenario, everything runs sequentially but the user doesn't have to type as much.
## Relation to `meltano invoke`
Jobs are not compatible with this syntax: `meltano invoke <job_name>`.
Instead users should use `meltano run <job_name>` or `meltano schedule run <schedule_name>` syntax.
| 8 |
7,603,319 | 93,119,602 |
2021-09-04 14:41:35.581
|
meltano install fails in Ubuntu 20.04 LXD image unless you use venv
|
### What is the current *bug* behavior?
`meltano install` works on my local env, running with a conda-built Python env, but when I went to LXD-based containers and wanted to use plain ol' Python, it crashed.
### What is the expected *correct* behavior?
`meltano install` should work without needing venv or any other virtual environment. I am in an LXD container and I understand the risks; the container is empty except for Meltano and my custom tap.
### Steps to reproduce
I use a Juju charm of Meltano to set up the env. Without LXD I am not sure you could reproduce it exactly, but essentially: clone a template Meltano project into an LXD container and try to install it.
```
version: 1
send_anonymous_usage_stats: false
project_id: SOMEID
plugins:
extractors:
- name: tap-csv
variant: meltano
pip_url: git+https://gitlab.com/meltano/tap-csv.git
config:
files:
- entity: tickers
file: extract/example_data.csv
keys:
- ticker
- name: tap-name
namespace: tap_namespace
executable: /PATH_TO_CUSTOM_TAP.sh
loaders:
- name: target-postgres
variant: datamill-co
pip_url: singer-target-postgres
config:
postgres_host: SOMEIP
postgres_username: SOMEUSERNAME
postgres_database: SOMEDB
postgres_schema: public
disable_collection: true
orchestrators:
- name: airflow
pip_url: apache-airflow==1.10.14 --constraint https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.6.txt
files:
- name: airflow
pip_url: git+https://gitlab.com/meltano/files-airflow.git
schedules:
- name: csv-to-postgres
extractor: tap-csv
loader: target-postgres
transform: skip
interval: '@daily'
start_date: 2021-08-07 13:52:51.715954
- name: csv-to-postgres
extractor: tap-csv
loader: target-postgres
transform: skip
interval: '@hourly'
start_date: 2021-08-07 13:52:51.715954
- name: csv-to-postgres
extractor: tap-csv
loader: target-postgres
transform: skip
interval: '@once'
start_date: 2021-08-07 13:52:51.715954
```
### Relevant logs and/or screenshots
_Please use code blocks (\`\`\`) to format console output_
```
meltano --log-level=debug install
[2021-09-04 13:38:02,314] [1172|MainThread|root] [DEBUG] Creating engine <meltano.core.project.Project object at 0x7f20bb37c760>@sqlite:////home/ubuntu/meltano_proj_repo/.meltano/meltano.db
[2021-09-04 13:38:02,359] [1172|MainThread|urllib3.connectionpool] [DEBUG] Starting new HTTPS connection (1): www.meltano.com:443
[2021-09-04 13:38:02,930] [1172|MainThread|urllib3.connectionpool] [DEBUG] https://www.meltano.com:443 "GET /discovery.yml HTTP/1.1" 200 115895
Installing 5 plugins...
[2021-09-04 13:38:03,586] [1172|MainThread|asyncio] [DEBUG] Using selector: EpollSelector
Installing file bundle 'airflow'...
[2021-09-04 13:38:03,587] [1172|MainThread|meltano.core.venv_service] [DEBUG] Removed old virtual environment for 'files/airflow'
[2021-09-04 13:38:03,587] [1172|MainThread|meltano.core.venv_service] [DEBUG] Failed to remove directory tree /home/ubuntu/meltano_proj_repo/.meltano/run/airflow
Traceback (most recent call last):
File "/usr/lib/python3.8/shutil.py", line 707, in rmtree
orig_st = os.lstat(path)
FileNotFoundError: [Errno 2] No such file or directory: '/home/ubuntu/meltano_proj_repo/.meltano/run/airflow'
[2021-09-04 13:38:03,587] [1172|MainThread|meltano.core.venv_service] [DEBUG] Creating virtual environment for 'files/airflow'
Installing orchestrator 'airflow'...
[2021-09-04 13:38:03,592] [1172|MainThread|meltano.core.venv_service] [DEBUG] Removed old virtual environment for 'orchestrators/airflow'
[2021-09-04 13:38:03,593] [1172|MainThread|meltano.core.venv_service] [DEBUG] Failed to remove directory tree /home/ubuntu/meltano_proj_repo/.meltano/run/airflow
Traceback (most recent call last):
File "/usr/lib/python3.8/shutil.py", line 707, in rmtree
orig_st = os.lstat(path)
FileNotFoundError: [Errno 2] No such file or directory: '/home/ubuntu/meltano_proj_repo/.meltano/run/airflow'
[2021-09-04 13:38:03,593] [1172|MainThread|meltano.core.venv_service] [DEBUG] Creating virtual environment for 'orchestrators/airflow'
Installing loader 'target-postgres'...
[2021-09-04 13:38:03,597] [1172|MainThread|meltano.core.venv_service] [DEBUG] Removed old virtual environment for 'loaders/target-postgres'
[2021-09-04 13:38:03,597] [1172|MainThread|meltano.core.venv_service] [DEBUG] Failed to remove directory tree /home/ubuntu/meltano_proj_repo/.meltano/run/target-postgres
Traceback (most recent call last):
File "/usr/lib/python3.8/shutil.py", line 707, in rmtree
orig_st = os.lstat(path)
FileNotFoundError: [Errno 2] No such file or directory: '/home/ubuntu/meltano_proj_repo/.meltano/run/target-postgres'
[2021-09-04 13:38:03,597] [1172|MainThread|meltano.core.venv_service] [DEBUG] Creating virtual environment for 'loaders/target-postgres'
Installing extractor 'tap-ibkr'...
Installing extractor 'tap-csv'...
[2021-09-04 13:38:03,601] [1172|MainThread|meltano.core.venv_service] [DEBUG] Removed old virtual environment for 'extractors/tap-csv'
[2021-09-04 13:38:03,601] [1172|MainThread|meltano.core.venv_service] [DEBUG] Failed to remove directory tree /home/ubuntu/meltano_proj_repo/.meltano/run/tap-csv
Traceback (most recent call last):
File "/usr/lib/python3.8/shutil.py", line 707, in rmtree
orig_st = os.lstat(path)
FileNotFoundError: [Errno 2] No such file or directory: '/home/ubuntu/meltano_proj_repo/.meltano/run/tap-csv'
[2021-09-04 13:38:03,601] [1172|MainThread|meltano.core.venv_service] [DEBUG] Creating virtual environment for 'extractors/tap-csv'
File bundle 'airflow' could not be installed: could not create the virtualenv for 'files/airflow'
Extractor 'tap-csv' could not be installed: could not create the virtualenv for 'extractors/tap-csv'
Orchestrator 'airflow' could not be installed: could not create the virtualenv for 'orchestrators/airflow'
Loader 'target-postgres' could not be installed: could not create the virtualenv for 'loaders/target-postgres'
Installed 0/5 plugins
Skipped installing 1/5 plugins
[2021-09-04 13:38:03,679] [1172|MainThread|meltano.cli.utils] [DEBUG] Failed to install plugin(s)
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/meltano/cli/__init__.py", line 44, in main
cli(obj={"project": None})
File "/usr/lib/python3/dist-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/usr/lib/python3/dist-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/usr/lib/python3/dist-packages/click/core.py", line 1137, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/usr/lib/python3/dist-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/usr/lib/python3/dist-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/meltano/cli/params.py", line 23, in decorate
return func(*args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/meltano/cli/params.py", line 56, in decorate
func(project, *args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/meltano/cli/install.py", line 59, in install
raise CliError("Failed to install plugin(s)")
meltano.cli.utils.CliError: Failed to install plugin(s)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/meltano/cli/__init__.py", line 52, in main
raise CliError(str(err)) from err
meltano.cli.utils.CliError: Failed to install plugin(s)
Failed to install plugin(s)
```
### Possible fixes
_If you can, link to the line of code that might be responsible for the problem or suggest a fix_
Once I started using venv within the container, it all went smoothly, and I have Meltano up and running inside the Ubuntu LXD container now. But I shouldn't have to use venv, IMO.
### Further regression test
_Ensure we automatically catch similar issues in the future_
- [ ] Write additional adequate test cases and submit test results
- [ ] Test results should be reviewed by a person from the team
| 4 |
7,603,319 | 92,938,667 |
2021-09-01 14:11:00.684
|
Support Deployment to AWS Lambda (Serverless)
|
AWS Lambda provides a fully-managed (serverless) environment that includes both scheduling and compute (including docker). By supporting lambda as a deployment option and orchestrator for Meltano we can provide an easy onramp into the Meltano ecosystem for customers already on AWS but otherwise unwilling to host and manage a self-deployed orchestrator like Airflow.
Integration would include:
- A lambda-compatible docker container for Meltano
- A way to manage state utilising only serverless services (DynamoDB, Aurora Serverless, S3 etc.)
- A way to gracefully handle the 15min max execution time of Lambda.
There are several possible approaches to the final requirement:
- Include a new `--max-execution-time` flag that terminates the tap gracefully (emitting state) even when more upstream records are available. This allows meltano to sync data using successive Lambda invocations without being 'reset' to a previous state due to timeouts, but carries the risk that streams fall behind and never catch up with the upstream source.
- Parallelise pipelines using 'one stream per function' to minimise the risk of several streams in a single pipeline approaching the limit.
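The first option could behave roughly like this (the `--max-execution-time` flag is only proposed; this is a generator sketch rather than real tap internals):

```python
import time

def records_until_deadline(records, max_execution_time, emit_state):
    """Yield upstream records until a deadline, then emit state and stop.

    Sketch of the proposed --max-execution-time behavior: the tap exits
    gracefully with a bookmark so the next Lambda invocation resumes there.
    """
    deadline = time.monotonic() + max_execution_time
    for record in records:
        yield record
        if time.monotonic() >= deadline:
            emit_state()  # persist the bookmark before stopping early
            return
    emit_state()  # normal completion also emits final state
```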
| 40 |
7,603,319 | 92,846,507 |
2021-08-31 08:07:24.001
|
Support Deployment Plugins
|
For Meltano to truly be a 'one-stop-shop' for your data platform, it needs to be able to deploy the individual tools and services that make up that platform. To achieve this, Meltano should support deployment plugins: existing tools for deploying and managing running infrastructure (such as Helm, Terraform, Ansible, Salt, CloudFormation). For those just getting started, Meltano should aim to provide opinionated base config for common scenarios (e.g. containers on Kubernetes), and for power users of these deployment tools the config and internals should still be accessible and easily customisable via existing Meltano config routes.
As a rough guide, the CLI could look something like:
```bash
# First a user would add a deployment. This would install all the necessary tooling to actually deploy to this platform during `meltano install`. Meltano will use all of the context of installed plugins, orchestrators and config to create appropriate deployment artefacts.
meltano add deployment --platform kubernetes --environment prod
# Users can optionally configure their deployment
meltano config deployment --environment prod set namespace "team-data"
# Finally a new deployment can be executed 🚀
meltano deploy --environment prod
# Individual tools can also be invoked and used directly
meltano invoke helm list --all-namespaces --all
```
| 40 |
7,603,319 | 92,696,255 |
2021-08-27 12:26:32.387
|
Refresh catalog on every invoke (fresh_catalog: true)
|
I want to have Meltano build a new catalog on every run when running `meltano invoke tap-oracle`.
A key at the `SingerPlugin` level probably makes sense; maybe call it `fresh_catalog`, defaulting to false?
More specific than #2850, as #2627 didn't solve what I'm after. What I really want is a way to manage and watch my catalog change over time (#2677 / #2805), but this issue will be an incremental improvement over where I'm at today.
Today I delete the catalog and cache key from `.meltano/run/tap-name/*`
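For reference, the manual workaround looks roughly like this (the exact file names under `.meltano/run/<tap-name>/` may differ by Meltano version; `tap-oracle` is just the example tap from this issue):

```shell
# force a fresh catalog on the next invoke by removing the cached one
rm -f .meltano/run/tap-oracle/tap.properties.json \
      .meltano/run/tap-oracle/tap.properties.cache_key
# the next `meltano invoke tap-oracle` then rediscovers the catalog
```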
| 4 |
7,603,319 | 92,230,218 |
2021-08-18 22:02:56.170
|
SIGTERM handling not compatible with Windows
|
As described in #2743, the SIGTERM handling is not currently supported on Windows.
Here is a possible explanation with workaround: https://stackoverflow.com/questions/26061814/signal-handling-in-windows
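Following the approach in that answer, a portable registration helper might look like this (a sketch, not Meltano's current code; `SIGBREAK` only exists on Windows, where it is raised by CTRL_BREAK_EVENT):

```python
import signal
import sys

def install_shutdown_handler(handler):
    """Register `handler` for graceful shutdown on both platforms.

    POSIX processes receive SIGTERM; Windows console processes never do,
    so there we fall back to SIGBREAK (plus SIGINT on both platforms).
    """
    sigs = [signal.SIGINT]
    if sys.platform == "win32":
        sigs.append(signal.SIGBREAK)  # Windows-only signal
    else:
        sigs.append(signal.SIGTERM)
    for sig in sigs:
        signal.signal(sig, handler)
    return sigs
```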
| 4 |
7,603,319 | 92,148,274 |
2021-08-17 21:50:20.183
|
Refactor to convert plugins to leverage async patterns in hooks
|
As a follow up to https://gitlab.com/meltano/meltano/-/issues/2868 (refactor SingerTarget to own bookmark writer) and a next step to https://gitlab.com/meltano/meltano/-/issues/2725 (Refactor `meltano elt`, `SingerRunner` and `DbtRunner` to be composable), we should refactor the plugins' various before/after hooks and supporting methods to be async.
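A minimal shape for the change (illustrative only, not the actual `Hookable` API) is a trigger that awaits coroutine hooks while still accepting plain functions, so plugins can migrate incrementally:

```python
import asyncio
import inspect

async def trigger_hooks(hooks, *args):
    """Run before/after hooks, awaiting any that are coroutine functions."""
    for hook in hooks:
        if inspect.iscoroutinefunction(hook):
            await hook(*args)   # async hook, e.g. awaits subprocess setup
        else:
            hook(*args)         # legacy sync hook still works

calls = []

async def async_hook(name):
    calls.append(f"async:{name}")

def sync_hook(name):
    calls.append(f"sync:{name}")

asyncio.run(trigger_hooks([async_hook, sync_hook], "before_invoke"))
print(calls)  # ['async:before_invoke', 'sync:before_invoke']
```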
| 8 |
7,603,319 | 91,977,173 |
2021-08-14 17:18:19.568
|
meltano invoke target-csv - 'NoneType' object has no attribute 'full_refresh'
|
Steps to replicate the problem
1. Install latest version of meltano
2. `meltano add target-csv`
3. `meltano invoke target-csv`
```
(.venv) visch@visch-ubuntu:~/git/meltano/testdir$ meltano --log-level=debug invoke target-csv
[2021-08-14 13:16:28,669] [1885|MainThread|root] [DEBUG] Creating engine <meltano.core.project.Project object at 0x7f7e7383ee50>@sqlite:////home/visch/git/meltano/testdir/.meltano/meltano.db
[2021-08-14 13:16:28,701] [1885|MainThread|urllib3.connectionpool] [DEBUG] Starting new HTTPS connection (1): www.meltano.com:443
[2021-08-14 13:16:28,922] [1885|MainThread|urllib3.connectionpool] [DEBUG] https://www.meltano.com:443 "GET /discovery.yml?project_id=e713709f-f225-4fff-8f19-03f3a5b9e910 HTTP/1.1" 200 115895
[2021-08-14 13:16:29,356] [1885|MainThread|meltano.core.utils] [DEBUG] Variable '$MELTANO_LOAD_SCHEMA' is missing from the environment.
[2021-08-14 13:16:29,358] [1885|MainThread|meltano.core.utils] [DEBUG] Variable '$MELTANO_LOAD_SCHEMA' is missing from the environment.
[2021-08-14 13:16:29,358] [1885|MainThread|root] [DEBUG] Created configuration at /home/visch/git/meltano/testdir/.meltano/run/target-csv/target.871a464e-14e9-4f81-8884-facea30b2ffd.config.json
[2021-08-14 13:16:29,359] [1885|MainThread|root] [DEBUG] Deleted configuration at /home/visch/git/meltano/testdir/.meltano/run/target-csv/target.871a464e-14e9-4f81-8884-facea30b2ffd.config.json
[2021-08-14 13:16:29,359] [1885|MainThread|meltano.cli.utils] [DEBUG] 'NoneType' object has no attribute 'full_refresh'
Traceback (most recent call last):
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/meltano/cli/__init__.py", line 44, in main
cli(obj={"project": None})
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/meltano/cli/params.py", line 23, in decorate
return func(*args, **kwargs)
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/meltano/cli/params.py", line 56, in decorate
func(project, *args, **kwargs)
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/meltano/cli/invoke.py", line 65, in invoke
handle = invoker.invoke(*plugin_args, command=command_name)
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/meltano/core/plugin_invoker.py", line 272, in invoke
with self._invoke(*args, **kwargs) as (popen_args, popen_options, popen_env):
File "/usr/lib/python3.8/contextlib.py", line 113, in __enter__
return next(self.gen)
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/meltano/core/plugin_invoker.py", line 257, in _invoke
with self.plugin.trigger_hooks("invoke", self, args):
File "/usr/lib/python3.8/contextlib.py", line 113, in __enter__
return next(self.gen)
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/meltano/core/behavior/hookable.py", line 70, in trigger_hooks
self.__class__.trigger(self, f"before_{hook_name}", *args, **kwargs)
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/meltano/core/behavior/hookable.py", line 97, in trigger
raise err
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/meltano/core/behavior/hookable.py", line 89, in trigger
hook_func(target, *args, **kwargs)
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/meltano/core/plugin/singer/target.py", line 107, in setup_bookmark_writer_hook
self.setup_bookmark_writer(plugin_invoker)
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/meltano/core/plugin/singer/target.py", line 120, in setup_bookmark_writer
incomplete_state = elt_context.full_refresh and elt_context.select_filter
AttributeError: 'NoneType' object has no attribute 'full_refresh'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/visch/git/meltano/.venv/lib/python3.8/site-packages/meltano/cli/__init__.py", line 52, in main
raise CliError(str(err)) from err
meltano.cli.utils.CliError: 'NoneType' object has no attribute 'full_refresh'
'NoneType' object has no attribute 'full_refresh'
(.venv) visch@visch-ubuntu:~/git/meltano/testdir$ meltano --version
meltano, version 1.79.0
```
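The failing line is `incomplete_state = elt_context.full_refresh and elt_context.select_filter`, and the traceback implies `elt_context` is `None` when the target is invoked directly rather than through an ELT run. A None-safe guard would avoid the crash (a sketch, simplified from the real method):

```python
def incomplete_state(elt_context):
    """Sketch of a None-safe version of the failing expression."""
    # Original: elt_context.full_refresh and elt_context.select_filter
    # raises AttributeError when elt_context is None (direct invoke).
    return bool(
        elt_context
        and elt_context.full_refresh
        and elt_context.select_filter
    )

print(incomplete_state(None))  # False instead of AttributeError
```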
| 2 |
7,603,319 | 91,736,373 |
2021-08-10 09:35:49.104
|
A file bundle is added twice via `--include-related` if its configuration does not contain the `update` field
|
### What is the current *bug* behavior?
When adding a plugin with `--include-related`, Meltano adds a related file bundle plugin twice if it does not include the `update` field in its schema. The file bundle installs correctly if `update` is given with any key-value pair.
### What is the expected *correct* behavior?
The file bundle should be added only once with `--include-related`, regardless of whether or not the `update` field is provided.
### Steps to reproduce
1. Define a custom file bundle plugin (e.g. `files-google-analytics`) sharing a namespace of some other plugin (e.g. `tap-google-analytics` with namespace `tap_google_analytics`) **WITHOUT** any `update` field
```yaml
files:
name: files-google-analytics
namespace: tap_google_analytics
repo: https://github.com/me/files-google-analytics
pip_url: https://github.com/me/files-google-analytics
```
1. Add the file bundle via `--include-related` (e.g. `meltano add extractor tap-google-analytics --include-related`)
1. Observe the file bundle (`files-google-analytics`) added twice, with two occurrences of `Adding related file bundle 'files-google-analytics' to your Meltano project...` in the output log.
### Relevant logs and/or screenshots
*Installs correctly*
```yaml
files:
name: files-google-analytics
namespace: tap_google_analytics
update:
path/to/some/file: true
repo: https://github.com/me/files-google-analytics
pip_url: https://github.com/me/files-google-analytics
```
```yaml
files:
name: files-google-analytics
namespace: tap_google_analytics
update:
path/to/some/file: false
repo: https://github.com/me/files-google-analytics
pip_url: https://github.com/me/files-google-analytics
```
*Installs incorrectly*
```yaml
files:
name: files-google-analytics
namespace: tap_google_analytics
repo: https://github.com/me/files-google-analytics
pip_url: https://github.com/me/files-google-analytics
```
```yaml
files:
name: files-google-analytics
namespace: tap_google_analytics
update: {}
repo: https://github.com/me/files-google-analytics
pip_url: https://github.com/me/files-google-analytics
```
### Possible fixes
If `update` is not provided, Meltano [defaults it to an empty object (`{}`)](https://meltano.com/docs/plugins.html#update-extra), which I assume is the root cause of this bug. Specifying `update: {}` manually yields the same results.
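One way the defaulting could cause a double add (an assumption about the mechanism, not verified against the Meltano source): if the "already added?" check compares a definition carrying the defaulted `update: {}` against the on-disk definition without it, the two never compare equal:

```python
declared = {
    "name": "files-google-analytics",
    "namespace": "tap_google_analytics",
}
# Meltano defaults the `update` extra to an empty object:
canonical = {**declared, "update": {}}

print(declared == canonical)  # False: a naive equality check re-adds the bundle
```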
### Further regression test
- [ ] Write additional adequate test cases and submit test results
- [ ] Test results should be reviewed by a person from the team
| 4 |
7,603,319 | 91,175,958 |
2021-07-30 22:57:22.692
|
Additional Metadata for PyPI
|
## Summary
Add more project metadata fields for PyPI.
## Proposed benefits
[//]: # (Concisely summarize the benefits this feature would bring to yourself and other users.)
- More resource links in our PyPI page: https://pypi.org/project/meltano/
- Better SourceRank score: https://libraries.io/pypi/meltano/sourcerank
## Proposal details
[//]: # (In as much detail as you are able, describe the feature you'd like to build or would like to see built.)
[According to the Poetry docs](https://python-poetry.org/docs/pyproject/), we seem to be missing a few optional metadata fields:
- `maintainers`
- `homepage` (https://meltano.com)
- `documentation` (https://meltano.com/docs)
- `keywords`
We can also [add custom URLs](https://python-poetry.org/docs/pyproject/#urls):
```toml
[tool.poetry.urls]
"Issue Tracker" = "https://gitlab.com/meltano/meltano/-/issues"
"Twitter" = "https://twitter.com/meltanodata/"
"Changelog" = "https://gitlab.com/meltano/meltano/-/blob/master/CHANGELOG.md"
"Slack" = "https://join.slack.com/t/meltano/shared_invite/zt-obgpdeba-7yrqKhwyMBfdHDXsZY8G7Q"
"GitHub Mirror" = "https://github.com/meltano/meltano"
"Youtube" = "https://www.youtube.com/meltano"
```
## Best reasons not to build
[//]: # (Will this negatively affect any existing functionality? Do you anticipate any breaking changes versus what may already be working today? Make the counter-argument to your proposal here.)
None that I can think of.
A similar issue for the SDK is https://gitlab.com/meltano/sdk/-/issues/175.
| 1 |
7,603,319 | 90,776,313 |
2021-07-24 17:25:42.516
|
Introduce top-level "Environments" within `meltano.yml` config
|
## Background
As Meltano becomes the end-to-end tool for more parts of dataops pipelines, there's a need to create a top-level construct and understanding of the project's SQL data repository (Snowflake, Spark, Athena, etc.) and that repository's corresponding "environments" such as "prod", "beta", "cicd", and "dev". Some of these environments may be dynamically keyed and fully "disposable", such as "cicd-<build_num>", and "dev-<person_id>".
## Today's Solutions
Today there are two or three places where a user needs to specify their data platform information - for instance "target-snowflake" config, "dbt profiles", and the analyze config, and possibly a "reverse-etl" connector such as "tap-snowflake". There's not yet a meltano-native construct for reusing this config.
## Proposed Solution
### Environment config as a top-level construct
We propose adding a top-level `environments:` config element. This could be a combination of "smart" and "passthrough" settings, where "smart" settings are known and expected by meltano, and "passthrough" settings are anything that the target (or dbt profile) may require.
### Environment Inheritance
To ease onboarding and support DRY principles, we should allow environments to inherit from each other. For instance, prod may be identical to dev except in the credentials and db name. Server name and other connection-level parameters may be identical between these environments, and we should allow the config to be reused - in the same way that the `inherits-from` construct supports this for plugins today.
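A sketch of how `inherits_from` resolution might work, assuming shallow dict-merge semantics (the real merge rules for nested plugin config would need to be specified separately):

```python
def resolve_environment(envs, name):
    """Resolve an environment by merging it over its `inherits_from` parent,
    recursively. Child keys win; nested dicts (e.g. `env`) are merged one
    level deep so a child can override a single variable.
    """
    env = envs[name]
    parent_name = env.get("inherits_from")
    if parent_name is None:
        return dict(env)
    merged = resolve_environment(envs, parent_name)
    for key, value in env.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = {**merged[key], **value}
        else:
            merged[key] = value
    return merged
```

Under this scheme, `beta` inheriting from `prod` only needs to declare the credentials and db name it overrides.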
### Strawman Sample
This is only intended for conversation - the exact spec will evolve based on input:
`meltano.yml`:
```yaml
project_id: ...
plugins: ...
schedules: ...
plugins:
extractors:
# this section unchanged - extractors referenced by name below in `sources`
- name: tap-github
namespace: tap_github
executable: tap-github
capabilities:
- state
- catalog
- discover
settings:
- name: searches
kind: array
- name: auth_token
kind: password
- name: start_date
value: '2010-01-01T00:00:00Z'
# ...
environments:
- name: prod
job_id_prefix: prod_ # will be appended with ${TAP_NAMESPACE} to make default job IDs
# The `destination` is the singular landing location ("target") for this environment
# Other environments inherit this destination and can customize either with `vars` or by overriding completely
default_loader: target-snowflake
enable_transforms: true # or false if the target does not support transforms (e.g. target-jsonl or target-s3-csv)
# Here we list all the sources that `prod` should contain
# These reference names declared under `plugins->extractors`
# Other environments may inherit this list or override it
default_extractors:
- tap-github
- tap-zendesk
- tap-slack
config:
plugins:
loaders:
- name: target-snowflake
config:
db_name: "${DB_NAME}" # References an environment-scoped env var
load_schema: "${TAP_NAMESPACE}_raw" # Dynamic from the name of the currently-executing tap
account: "${SNOWFLAKE_ACCOUNT}"
user: "${SNOWFLAKE_USER}"
role: prod_loader_role
# ...
env:
# Env vars can be declared here when multiple plugins need to reference the same values
DB_NAME: "Waffleshop_DB"
SNOWFLAKE_ACCOUNT: waffleshop.snowflakecomputing.net
SNOWFLAKE_USER: prod_loader
- name: beta
job_id_prefix: beta_ # will be appended with ${TAP_NAMESPACE} to make default job IDs
inherits_from: prod
config:
plugins:
loaders:
- name: target-snowflake
config:
role: beta_loader_role
env:
DB_NAME: "Waffleshop_Beta"
SNOWFLAKE_USER: beta_loader
# The following environments are "disposable", meaning they are intentionally keyed for temporarily use
- name: user_dev
job_id_prefix: dev_${USER}_ # will be appended with ${TAP_NAMESPACE} to make default job IDs
inherits_from: beta
config:
plugins:
loaders:
- name: target-snowflake
config:
role: beta_loader_role
load_schema: "${USER}dev_${TAP_NAMESPACE}_raw" # Dynamic based on the USER environment variable
env:
DB_NAME: "Development"
SNOWFLAKE_USER: "${USER}"
- name: cicd
job_id_prefix: cicd_b${GITHUB_BUILD_NUM}_ # will be appended with ${TAP_NAMESPACE} to make default job IDs
inherits_from: beta
sources:
config:
plugins:
extractors:
# Overrides to make CICD run faster using a subset of data
- name: tap-github
config:
searches:
- {name: "target-athena", query: "target-athena+fork:only"}
loaders:
- name: target-snowflake
config:
role: cicd_loader_role
load_schema: "raw_b${GITHUB_BUILD_NUM}_${TAP_NAMESPACE}" # Dynamic based on build number from Github CI pipeline
```
In the simplest iterations, all of the environment's config would be available to invoked plugins via a name prefix such as `$MELTANO_ENV_{VAR_NAME}`, for example `$MELTANO_ENV_SNOWFLAKE_USER` or `$MELTANO_ENV_DB_NAME`. Importantly, since we expect to be toggling the environment back and forth at runtime, the generated environment variables would not have the name of the environment contained within the variable name. However, we probably could still expose the environment name itself in something like `$MELTANO_ENV_NAME`, in case a plugin has native configuration which can branch based on that name.
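The variable generation described above could be sketched as follows (names follow this proposal, not any shipped behavior):

```python
def env_vars_for(environment):
    """Build the runtime variables for the active environment: its `env`
    entries get the MELTANO_ENV_ prefix, and MELTANO_ENV_NAME carries the
    environment's own name so plugins can branch on it.
    """
    variables = {"MELTANO_ENV_NAME": environment["name"]}
    for key, value in environment.get("env", {}).items():
        variables[f"MELTANO_ENV_{key}"] = value
    return variables
```

Because the prefix is fixed (no environment name embedded), switching from `prod` to `beta` at runtime changes the values without changing any variable names a plugin references.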
### Possibility for native understanding of needs of different plugin types
The above is a simple "passthrough" approach where these keywords might not be natively understood by Meltano. However, we may choose to create subprofiles (now or in a future iteration) where meltano could send slightly different config to a transformer versus what it would send to a loader.
## Best reason not to build
None I can think of. It is one level of additional config to setup, so we may want to ensure it is optional. Especially for EL-only pipelines where there are no further steps, we would not want a user to _necessarily_ need to configure an environment if they are happy with simple tap->target configuration as they perform today.
| 12 |
7,603,319 | 90,291,165 |
2021-07-14 20:04:12.901
|
"Yank" older meltano versions which are missing Python constraints
|
We have a problem right now that many users are installing meltano on Python 3.9 - which is not supported - but the install is moving forward anyway on older versions (specifically `1.67`) which were published without an explicit max supported Python version. The net result is (1) the new user gets a stale version of Meltano, and (2) they don't get any messaging about 3.9 being unsupported.
## Long-term mitigation needed
Although we are soon going to launch 3.9 support (#2545) this issue will resurface again in October when [Python 3.10 is expected to release](https://www.google.com/search?q=python+3.10+release+date).
## "Yanking" old versions
According to the PyPI docs:

- https://pypi.org/help/#yanked
- https://www.python.org/dev/peps/pep-0592/
## Risk of Resurgence Post-Fix
If we do the one-time yank of old versions, this issue should not resurface because all published versions have a max supported version.
| 1 |
7,603,319 | 90,037,653 |
2021-07-09 17:10:29.542
|
Start tracking open source licenses
|
We should start tracking the licenses of the open source packages Meltano depends on, to make sure Meltano can still be licensed Apache 2.
We can use https://docs.gitlab.com/ee/user/compliance/license_compliance/. It uses https://github.com/pivotal/LicenseFinder, which requires `requirements.txt`, which we don't have since we're using Poetry, but we should be able to generate that on the fly using the `SETUP_CMD` env var described in https://docs.gitlab.com/ee/user/compliance/license_compliance/#installing-custom-dependencies.
| 12 |
7,603,319 | 89,864,528 |
2021-07-06 22:42:10.315
|
Implement interface for meltano `validator` plugins
|
Related to #2454, we will need a spec for how validators should interface with meltano CLI.
As part of the spec definition, we're looking at how the following validators would work:
- Great Expectations (full)
- `dbt test` (native dbt feature)
- Stream maps
- SQLFluff - (potentially as this could be considered a code validator)
## Latest Proposal (2021-12-07):
Similar to the below from 2021-10-27, except that there is no "top level `tests:`" config. Instead, "plugins" `have` "commands" and "commands" `can be of type` "test". A command is of type `test` if its name starts with the word `test`.
## Older Proposal (2021-10-27):
* Discussion of the proposed spec from this comment thread: https://gitlab.com/meltano/meltano/-/issues/2838#note_682557413
1. Already today: **Any plugin type can have a `test` command, added inline with other commands using existing command syntax.**
1. Already today: Each test command is invocable directly and can perform its actions based on just environment vars and other context such as command line args.
1. New: **Any plugin with a `test` command can also have a `tests:` list entry in the top-level plugin config.** Each item in the list of `tests` can have the following properties:
1. `name` - optional, if test is to be referenced by name
2. `executable` - optional, overrides the executable of the default `test` command at the plugin level
3. `args` - optional, overrides the args of the `test` command at the plugin level
4. `env` (or `vars`) - a list of env variable name/value combinations to be passed to the executor
1. New: Tests could be invoked with any of the following syntaxes:
1. `meltano run <plugin>:test` - runs any and all tests defined in the plugin declaration.
- If the `test` command exists but no `tests:` declaration, only the one test command will be executed.
- If the `tests:` entry also exists, all enumerated tests will be run in order.
- If the `test` command does not exist, the command fails.
2. `meltano test --all` (a new cli command) runs all defined tests in all plugins.
3. `meltano test <plugin>` runs all defined tests for the named plugin.
3. `meltano test <plugin1> <plugin2>` runs all defined tests for the named plugins.
4. `meltano test <plugin>:<test-name>` runs a named test for the named plugin.
4. `meltano test <plugin1>:<test-name1> <plugin2>:<test-name2>` runs multiple named tests for the named plugins.
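The `<plugin>:<test-name>` addressing scheme above could be parsed with a tiny helper (illustrative sketch only; the function name is hypothetical):

```python
def parse_test_ref(ref):
    """Split a CLI argument like 'tap-gitlab:test-schema' into
    (plugin, test_name). A bare plugin name yields test_name=None,
    meaning "run all of this plugin's tests".
    """
    plugin, _, test_name = ref.partition(":")
    return plugin, (test_name or None)
```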
Future Considerations:
Sequencing would need to be provided explicitly in the `v1` release, but `depends_on` behavior could be added in a v2, perhaps after `environments` (#2869) is released.
<details><summary>Previous proposals, click to expand</summary>
## Option A: Dedicated CLI command `meltano validate` (confirmed first iteration)
This would run validations as a dedicated top-level command. How the integration is performed would be governed by (a) custom code in meltano library and/or (b) configuration specs in the `validators:` entry in `discovery.yml`.
## Option B: Integration with `meltano elt` (won't do, deprecating in favor of `meltano run`)
Could be enabled/disabled similar to how transforms are treated, with a CLI flag.
## Option C: Integration with (WIP) `meltano run` (will do, but in future iteration)
Could be listed in a pipeline definition such as:
`meltano run tap-salesforce target-snowflake dbt:test ge:test`
Given an invocation such as the above, Meltano may be able to infer that the two validators can run in parallel, and may automatically parallelize them.
</details>
| 12 |
7,603,319 | 89,795,001 |
2021-07-05 20:37:21.864
|
Document where domains/DNS is managed
|
We should document on https://meltano.com/handbook/engineering/ where our various domain names are registered (AWS, NameCheap, Gandi) and where DNS is managed (typically SiteGround).
Related to https://gitlab.com/gitlab-com/gl-infra/infrastructure/-/issues/12688
@aaronsteers I seem to recall you already created an issue for this, but if not, this can be it.
| 2 |
7,603,319 | 89,630,866 |
2021-07-01 18:33:23.964
|
Coroutine async issue with parallel installs version 1.77.0
|
We are installing meltano inside a Docker container with a fairly simple `pip install --upgrade pip` + `pip install meltano` with the conditions:
Docker Image: Ubuntu 20.04
Python version: 3.8.5
Meltano version: 1.77.0 (but applies to 1.73.0 - 1.77.0)
and the command `meltano install` causes the following problem:
```
meltano install
Installing 3 plugins...
Installing loader 'target-mysql'...
Installing loader 'target-postgres'...
Installing extractor 'tap-s3-csv'...
'coroutine' object has no attribute 'read'
```
We are installing 1 Singer tap and 2 targets, not a particularly complicated setup. To solve this problem, we are currently using Meltano version 1.72.0. Note that I am totally new to Meltano, so I can't really propose comprehensive solutions. Please let me know if you need any more information.
UPDATE: That was the original situation, but once I ran `apt-get install python3-venv`, all of these issues were solved and Meltano version 1.77.0 was installed as it should be. So, maybe we just need better documentation on how to install Meltano in a custom docker container.
| 4 |
7,603,319 | 89,548,766 |
2021-06-30 14:42:27.059
|
meltano add extractor tap-google-analytics --include-related causes a plugin install error.
|
### What is the current *bug* behavior?
When running `meltano add extractor tap-google-analytics --include-related` you get an error `File transform/dbt_project.yml could not be found. Run meltano add files dbt to set up a dbt project.` and at the end of the install process you get messages saying `Installed 5/6 plugins` & `Failed to install plugin(s)`
The dbt files do still get added under in the transform dir.
This is using python 3.8.5 & Meltano version 1.77.0
### What is the expected *correct* behavior?
No error should be displaying.
### Steps to reproduce
1. `meltano init new_project`
2. Inside the new project run `meltano add extractor tap-google-analytics --include-related`
3. You get the error "File transform/dbt_project.yml could not be found. Run `meltano add files dbt` to set up a dbt project." and the summary `Installed 5/6 plugins` & `Failed to install plugin(s)`
4. Look inside the transform dir, and you'll see the dbt files that apparently failed to install.
### Relevant logs and/or screenshots
```
(venv) dw:~/Documents/test-google$ meltano add extractor tap-google-analytics --include-related
Added extractor 'tap-google-analytics' to your Meltano project
Variant: meltano (default)
Repository: https://gitlab.com/meltano/tap-google-analytics
Documentation: https://hub.meltano.com/extractors/google-analytics.html
Added related transform 'tap-google-analytics' to your Meltano project
Variant: meltano (default)
Repository: https://gitlab.com/meltano/dbt-tap-google-analytics
Added related transformer 'dbt' to your Meltano project
Repository: https://github.com/fishtown-analytics/dbt
Documentation: https://meltano.com/docs/transforms.html
Added related file bundle 'dbt' to your Meltano project
Repository: https://gitlab.com/meltano/files-dbt
Added related model 'model-google-analytics' to your Meltano project
Variant: meltano (default)
Repository: https://gitlab.com/meltano/model-google-analytics
Added related dashboard 'dashboard-google-analytics' to your Meltano project
Variant: meltano (default)
Repository: https://gitlab.com/meltano/dashboard-google-analytics
Installing dashboard 'dashboard-google-analytics'...
Installing model 'model-google-analytics'...
Installing file bundle 'dbt'...
Installing transformer 'dbt'...
Installing transform 'tap-google-analytics'...
Adding dbt package 'https://gitlab.com/meltano/dbt-tap-google-analytics.git@config-version-2' to 'transform/packages.yml'...
Adding dbt model to 'transform/dbt_project.yml'...
File transform/dbt_project.yml' could not be found. Run `meltano add files dbt` to set up a dbt project.
Installing extractor 'tap-google-analytics'...
Installed model 'model-google-analytics'
Installed file bundle 'dbt'
Adding 'dbt' files to project...
Created transform/dbt_project.yml
Created transform/.gitignore
Created transform/profile/profiles.yml
Created transform/models/.gitkeep
Installed dashboard 'dashboard-google-analytics'
Importing reports...
Imported report 'Audience Overview per month'
Imported report 'Relative change per month'
Imported report 'Active Users'
Imported report 'Audience Overview per day'
Importing dashboards...
Imported dashboard 'Google Analytics'
Installed extractor 'tap-google-analytics'
Installed transformer 'dbt'
Installed 5/6 plugins
Failed to install plugin(s)
```

### Possible fixes
_If you can, link to the line of code that might be responsible for the problem or suggest a fix_
### Further regression test
_Ensure we automatically catch similar issues in the future_
- [ ] Write additional adequate test cases and submit test results
- [ ] Test results should be reviewed by a person from the team
| 1 |
7,603,319 | 89,428,092 |
2021-06-28 19:15:43.878
|
Add tap-twilio as discoverable
| null | 1 |
7,603,319 | 89,415,214 |
2021-06-28 14:42:30.733
|
Update Slack Users to 1200+
| null | 1 |
7,603,319 | 89,198,678 |
2021-06-23 21:46:24.752
|
Add tap-ask-nicely
| null | 1 |
7,603,319 | 88,909,627 |
2021-06-18 02:53:36.149
|
Copy updated dbt and transform settings from discovery.yml due to dbt 0.19.1 upgrade
|
The following discussions from !2170 should be addressed:
- [ ] @tayloramurphy started a [discussion](https://gitlab.com/meltano/meltano/-/merge_requests/2170#note_602038447): (+3 comments)
> @aaronsteers will you also make an MR to update [discovery.yml](https://gitlab.com/meltano/hub/-/blob/main/_data/meltano/discovery.yml), [dbt.yml](https://gitlab.com/meltano/hub/-/blob/main/_data/meltano/transformers/dbt.yml), and the [transform yamls](https://gitlab.com/meltano/hub/-/tree/main/_data/meltano/transforms)?
| 2 |
7,603,319 | 88,885,342 |
2021-06-17 14:01:03.276
|
Generate a "run_results" artifact after any Meltano run
|
In discussion with Stephen Bailey from Immuta he brought up the idea of having some kind of run artifact similar to what dbt does with [their artifacts](https://docs.getdbt.com/reference/artifacts/dbt-artifacts#when-are-artifacts-produced).
At a minimum it would be good to have some kind of summary of an `elt` run to be able to ship anywhere, but I could also see us having the equivalent of a manifest generated as well. This would tie in well with https://gitlab.com/meltano/meltano/-/issues/2611
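To make the idea concrete, here is a hypothetical shape for such an artifact, loosely modeled on dbt's `run_results.json`. Every field name below is illustrative only, not a committed spec:

```python
import json
import time

def run_results(job_id, success, records_processed):
    """Serialize a minimal run summary that could be shipped anywhere
    (object storage, CI artifacts, an observability tool)."""
    return json.dumps({
        "job_id": job_id,
        "generated_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "success": success,
        "records_processed": records_processed,
    })
```

A fuller manifest-style artifact could later add per-stream counts, plugin versions, and timing breakdowns.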
| 8 |
7,603,319 | 88,813,557 |
2021-06-16 13:20:23.312
|
Update location of simplified Spec
|
### What is the current *bug* behavior?
Links from google and the Meltano blog point to:
https://meltano.com/docs/singer-spec.html
which returns a 404 Not Found error. Specific blog post:
https://meltano.com/blog/2021/04/07/a-simplified-take-on-the-singer-spec/
### What is the expected *correct* behavior?
New URL is:
https://hub.meltano.com/singer/spec
| 1 |
7,603,319 | 88,599,822 |
2021-06-12 00:21:06.844
|
Publish dbt packages to https://hub.getdbt.com/
|
Publishing to dbtHub could give ~~2~~ 3 significant benefits:
1. Allow installation via semantic versioning, which could give a better versioning option vs the branch-pinning solution for `config-version-2` updates needed for #2694. (Semver range restrictions are only available for hub-hosted packages.)
2. Introduce dbt users to Meltano by surfacing our work there on https://hub.getdbt.com/
3. Attract users (and community contributors) for those packages.
@tayloramurphy, @DouweM - Thoughts?
Checklist:
- [ ] Documentation:
- [ ] Readmes should answer: "What is Meltano?" and "What is Singer?".
- [ ] Readmes should mention all streams need to be selected.
- [ ] Readmes should have a short walkthrough (and/or link to one) using meltano to sink from tap to target and run the transform.
- [ ] We should note which database(s) each package has been tested on. If possible, we should try to add Snowflake as well as Postgres.
- [ ] CI Testing
- [ ] We should require an end-to-end CI test for each before publishing to Meltano hub. (This could be the first Meltano experience for many new users, and I expect there could be a large demand/interest.)
- [ ] Migration:
- [ ] GitHub project migration or mirroring (barring GitLab support on Meltano side https://github.com/fishtown-analytics/hub.getdbt.com/issues/571)
| 8 |
7,603,319 | 88,544,583 |
2021-06-10 22:26:11.224
|
Safely skip `docker build` step (`meltano_dev`)
|
The `meltano_dev` step takes approximately 14-15 minutes, which is between 33 and 50 percent of average CI build times. We can reduce this time with one or more of the following methods:
1. Only rebuilding the docker image when dependencies change:
1. Create a hash of `pyproject.toml` and `poetry.lock` and use a combination/hash of those hashes as the docker tag.
2. Before running the docker build, check if the docker hash already exists as a tag. If so, reuse the existing image in subsequent steps (skips the build time).
3. Create a manual build step which can be used to force a docker image build even when nothing has changed.
2. Running 'fail-early' tests redundantly without a fully-rebuilt docker image.
1. Start with a stock python image and/or the image from the last stable master build.
2. Run `poetry install` - which should finish immediately if there's nothing to do.
3. Run at minimum `pytest [3.8]` and `python_lint [3.8]` using the older image. The results should mostly be redundant with the similar tests, except they should run (and fail) about 12 minutes faster.
4. As with the above, if we choose to use the last master image rather than a stock python 3.8 image, we will want a manual build step option to replace the prior/cached image with the latest from the current branch.
Any other options/strategies I'm missing?
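Method 1's tag computation could be as simple as the sketch below (the file names match this repo; the 12-character truncation is an arbitrary choice):

```python
import hashlib

def dependency_hash(paths=("pyproject.toml", "poetry.lock")):
    """Hash the dependency manifests into one short, stable docker tag.
    The image only needs rebuilding when this value changes.
    """
    digest = hashlib.sha256()
    for path in paths:
        with open(path, "rb") as handle:
            digest.update(handle.read())
    return digest.hexdigest()[:12]
```

In CI, the build step would first check whether a tag with this value already exists in the registry and skip the build if so.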
| 8 |
7,603,319 | 88,406,866 |
2021-06-08 16:10:06.672
|
Docs for ephemeral production pipelines running (incrementally) with no Postgres backend
|
In creating the pipeline for the Hub, I did not find docs for how to productionalize a pipeline while not requiring a postgres database. I was thinking of adding a short guide in `Tutorials` or in `Productionalization`.
Here are the related resources I found:
- How to use the `state` extra: https://meltano.com/docs/plugins.html#state-extra
- `--dump` option with example: https://meltano.com/docs/command-line-interface.html#examples
I think the process we want for the Gitlab pipeline is:
1. Load `state.json` from external storage (in our case, either S3 or a Gitlab CI artifact store.)
2. Ensure the `state` extra is configured.
3. ~~**To confirm:** during execution, `state.json` should hopefully be continually updated by meltano. (I think _we do not_ need the `--dump` option, in this case.)~~ During load, state is incrementally updated in the system db (only).
- State must be exported by running a second invocation with the `--dump=state` CLI option.
4. Catch failures and upload the new state regardless, so that incremental processing will continue on the next run even if the operation is interrupted.
5. Still raise a failure (return non-zero) if the pipeline fails, so the job status reflects the execution result.
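Steps 3-5 above could be wrapped in a small orchestration function. This is a sketch: the three callables stand in for invoking `meltano elt`, invoking it again with `--dump=state`, and uploading to S3 or the CI artifact store.

```python
def run_pipeline(run_elt, dump_state, upload_state):
    """Run the pipeline, always try to export and upload state (even after
    a failure so incremental runs can continue), and return the original
    exit code so CI still reports the failure.
    """
    exit_code = run_elt()
    try:
        state = dump_state()
        if state:
            upload_state(state)
    except Exception:
        pass  # a state-export hiccup shouldn't mask the pipeline result
    return exit_code
```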
| 4 |
7,603,319 | 88,341,741 |
2021-06-07 17:46:11.086
|
Add Lightdash as a utility plugin
|
This is a new BI tool: open source, (apparently) CI-friendly, and integrated with dbt.
https://www.lightdash.com/
License: MIT
Sample code (inline with a dbt `schema.yml` file):

From https://docs.lightdash.com/guides/how-to-create-metrics
| 4 |
7,603,319 | 88,288,543 |
2021-06-07 01:25:28.832
|
catalog file is invalid: 'NoneType' object has no attribute 'startswith'
|
Slack thread: https://meltano.slack.com/archives/C01TCRBBJD7/p1623028702188300
```
[2021-06-06 21:13:58,784] [20702|MainThread|meltano.cli.utils] [DEBUG] Applying catalog rules failed: catalog file is invalid: 'NoneType' object has no attribute 'startswith'
Traceback (most recent call last):
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/core/plugin/singer/tap.py", line 351, in apply_catalog_rules
cache_key = self.catalog_cache_key(plugin_invoker)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/core/plugin/singer/tap.py", line 372, in catalog_cache_key
if plugin_invoker.plugin.pip_url.startswith("-e"):
AttributeError: 'NoneType' object has no attribute 'startswith'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/cli/__init__.py", line 44, in main
cli(obj={"project": None})
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/cli/params.py", line 23, in decorate
return func(*args, **kwargs)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/cli/params.py", line 56, in decorate
func(project, *args, **kwargs)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/cli/invoke.py", line 65, in invoke
handle = invoker.invoke(*plugin_args, command=command_name)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/core/plugin_invoker.py", line 244, in invoke
with self._invoke(*args, **kwargs) as (popen_args, popen_options, popen_env):
File "/usr/lib/python3.8/contextlib.py", line 113, in __enter__
return next(self.gen)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/core/plugin_invoker.py", line 229, in _invoke
with self.plugin.trigger_hooks("invoke", self, args):
File "/usr/lib/python3.8/contextlib.py", line 113, in __enter__
return next(self.gen)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/core/behavior/hookable.py", line 70, in trigger_hooks
self.__class__.trigger(self, f"before_{hook_name}", *args, **kwargs)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/core/behavior/hookable.py", line 97, in trigger
raise err
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/core/behavior/hookable.py", line 89, in trigger
hook_func(target, *args, **kwargs)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/core/plugin/singer/tap.py", line 303, in apply_catalog_rules_hook
self.apply_catalog_rules(plugin_invoker, exec_args)
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/core/plugin/singer/tap.py", line 365, in apply_catalog_rules
raise PluginExecutionError(
meltano.core.plugin.error.PluginExecutionError: Applying catalog rules failed: catalog file is invalid: 'NoneType' object has no attribute 'startswith'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/visch/git/oracle2mssql/.venv/lib/python3.8/site-packages/meltano/cli/__init__.py", line 52, in main
raise CliError(str(err)) from err
meltano.cli.utils.CliError: Applying catalog rules failed: catalog file is invalid: 'NoneType' object has no attribute 'startswith'
Applying catalog rules failed: catalog file is invalid: 'NoneType' object has no attribute 'startswith'
```
Meltano.yml
```
- name: tap-name
namespace: tap_oracle
#pip_url: git+https://github.com/transferwise/pipelinewise-tap-oracle
executable: /home/visch/git/tap-oracle/venv/bin/tap-oracle
```
If you change this to
```
- name: tap-name
namespace: tap_oracle
pip_url: fake
executable: /home/visch/git/tap-oracle/venv/bin/tap-oracle
```
everything works
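One possible shape for a fix, sketched against the traceback above (this is not the actual Meltano source; the real change would live in `catalog_cache_key` in `tap.py`):

```python
import hashlib

def catalog_cache_key(pip_url):
    """Return a cache key for the discovered catalog, or None when caching
    should be skipped: editable installs ('-e ...') and plugins that only
    declare an `executable` (pip_url is None, as in the report above).
    """
    if pip_url is None or pip_url.startswith("-e"):
        return None
    return hashlib.sha256(pip_url.encode()).hexdigest()
```

Guarding for `None` before calling `.startswith` would let executable-only plugins work without the `pip_url: fake` workaround.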
| 2 |
7,603,319 | 88,229,562 |
2021-06-04 16:17:12.203
|
health check return 401 when MELTANO_AUTHENTICATION is set to true
|
<!---
Please read this!
Before opening a new issue, make sure to search for keywords in the issues
filtered by the "regression" or "bug" label and verify the issue you're about to submit isn't a duplicate.
If you are submitting an issue with a tap, please include:
- account details
- target details
- entities selected with meltano select (if you have selected any entities), as the bug may be related to a specific entity
- the full elt command you are running
- full output of the meltano elt command. Logs can get pretty long, so you can add the full log as a snippet in the Meltano project and add a link in the issue.
--->
### What is the current *bug* behavior?
_What is happening now?_
The health check endpoint (/api/v1/health) returns 401 when MELTANO_AUTHENTICATION is set to true.
### What is the expected *correct* behavior?
_What should be happening?_
It should return a 200 without requiring authentication.
### Steps to reproduce
_How one can reproduce the issue?_
Set MELTANO_AUTHENTICATION to true, run `meltano ui`, and then check the response at http://host:port/api/v1/health
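The expected rule can be expressed as a tiny predicate (illustrative only; Meltano's actual routing and auth code is not shown here):

```python
def requires_auth(path, authentication_enabled):
    """The health endpoint should always be public - load balancers and
    orchestrators probe it unauthenticated - while every other route
    follows the MELTANO_AUTHENTICATION flag.
    """
    if path == "/api/v1/health":
        return False
    return authentication_enabled
```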
### Relevant logs and/or screenshots
_Please use code blocks (\`\`\`) to format console output_
I will add it later.
### Possible fixes
_If you can, link to the line of code that might be responsible for the problem or suggest a fix_
### Further regression test
_Ensure we automatically catch similar issues in the future_
- [ ] Write additional adequate test cases and submit test results
- [ ] Test results should be reviewed by a person from the team
| 8 |
7,603,319 | 88,122,570 |
2021-06-02 19:36:06.165
|
Minor tech debt from pre-commit, flakehell, and flake8
|
As noted in https://gitlab.com/meltano/meltano/-/merge_requests/2150#note_591431777, there's some minor tech debt I want to document for future review.
1. [ ] `pre-commit` hooks are not version locked with poetry
2. [ ] `flakehell` uses a private interface of `flake8`, which broke when `flake8` released `3.9.1` (also for `3.9.2` and likely future releases as well).
3. [ ] Workaround added in https://gitlab.com/meltano/meltano/-/merge_requests/2150 adds a `extended_default_ignore=[]` declaration into `pyproject.toml`, which fixes the issue.
In the future, it would be good to see if we can sync the pre-commit hooks with Poetry (if possible) and/or apply other mitigations.
| 20 |
7,603,319 | 87,892,634 |
2021-05-28 21:21:07.292
|
Emit structured (JSON) logs
|
All of my meltano logs end up in datadog. If the logs were formatted as json, it would automatically break up each log message into its various components. This is possible to do with a custom regex as well, but not quite as convenient.
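For reference, a minimal sketch of what structured output could look like using Python's standard `logging` module (this is not Meltano's internal logging setup; the field names are arbitrary):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Emit each log record as one JSON object so tools like Datadog can
    parse level, logger, and message without a custom regex."""

    def format(self, record):
        return json.dumps({
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })
```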
| 8 |
7,603,319 | 87,892,108 |
2021-05-28 20:57:25.017
|
Update stale `meltano add --custom` doc example
|
I missed this during the release and will need to follow up. The docs are correct in how they instruct the user to proceed, but the example invocation of the `add --custom` command is stale by at least a couple releases.
Search for `meltano add --custom extractor tap-covid-19` to find the sample, and update the page with the latest CLI prompts and input values.
| 2 |
7,603,319 | 87,771,495 |
2021-05-26 19:22:54.447
|
Change default behavior in `meltano config ... set` when `kind` is missing
|
We should consider altering how Meltano behaves when `kind` is missing from specific settings. The default behavior when `kind` is not known is to treat the value as insensitive. We may instead want to prompt the user to specify the `kind` and/or the sensitivity level of an otherwise undeclared setting value.
For instance, prompting a user to specify it when configuring with `meltano config <plugin> set <setting> <value>`. Especially for custom taps (including many taps discovered from the hub), the `kind` and therefore the sensitivity level is not known, and we should probably not automatically default to treating that value as _not_ being a secret.
This could also serve as a catch for typos or accidentally-incorrect setting names: rather than going through the wizard's steps and storing under the provided setting name, the user may realize they've input an incorrect setting name and abort, then retry the process with the correct setting name.
Here's an example invocation. The result of the below (using an unknown/incorrect setting name) is that the setting gets stored in yaml as clear text:
```
% poetry run meltano config target-jsonl set myset thisisnotasecret
Loader 'target-jsonl' setting 'myset' was set in `meltano.yml`: 'thisisnotasecret'
```
## Proposal
- We should at minimum warn the user that `kind` is missing.
- If an interactive prompt is available, we can ask the user interactively what the proper `kind` value is _before_ serializing the new setting value.
- The setting's `kind` entry would be serialized to yaml as well.
- We can add a `--quiet` flag optionally, to mute the new behavior and treat the setting as `kind: string` if not set. (Same as current status quo.)
## Proposal (Updated 2022-04-16)
- Combining with #3227+:
- If `--interactive` is used, we can _also_ prompt for setting config (including `kind`) at the same time as prompting for the new value.
- If `--interactive` is not set, we can fail with a prompt to use `--interactive` _or_ we can automatically switch to `--interactive` mode if `kind` is missing or the setting is not yet defined.
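The decision logic above could be factored roughly like this. Everything here is a hypothetical sketch, not Meltano's real API: the helper name `resolve_kind`, the `KNOWN_KINDS` set, and the `prompt_fn` callable (which in practice might be `click.prompt`) are all assumptions for illustration.

```python
# Illustrative list of accepted kinds; Meltano's real set may differ.
KNOWN_KINDS = {"string", "integer", "boolean", "password", "hidden"}


def resolve_kind(declared_kind, interactive, prompt_fn):
    """Decide a setting's `kind` before serializing its value.

    declared_kind: value from discovery/meltano.yml, or None if missing.
    interactive:   whether we may ask the user (the proposed --interactive flag).
    prompt_fn:     callable returning the user's answer (e.g. click.prompt).
    """
    if declared_kind in KNOWN_KINDS:
        return declared_kind
    if not interactive:
        # Status quo / proposed --quiet behavior: warn and fall back to string.
        print("warning: setting has no declared `kind`; treating as `string`")
        return "string"
    answer = prompt_fn()
    if answer not in KNOWN_KINDS:
        raise ValueError(f"unknown kind: {answer!r}")
    return answer
```

For example, `resolve_kind(None, True, lambda: "password")` would record the setting as sensitive, while the non-interactive path preserves today's behavior.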
| 4 |
7,603,319 | 87,268,193 |
2021-05-18 09:10:49.995
|
`meltano invoke airflow` fails with `StrictVersion` error when Airflow is misconfigured
|
When the `LocalExecutor` is set with the default SQLite backend, Meltano gives a `StrictVersion object has no attribute 'version'` error.
The actual message should be something like: "error: cannot use SQLite with the LocalExecutor."
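A sketch of the kind of early guard that would produce the clearer message, instead of letting the incompatible configuration surface later as an obscure version-comparison error. The function name and signature are illustrative assumptions, not Meltano's actual code:

```python
def check_airflow_config(executor: str, sql_conn: str) -> None:
    """Fail early with a clear message instead of an obscure StrictVersion error.

    executor: the configured Airflow executor, e.g. "LocalExecutor".
    sql_conn: the Airflow metadata DB connection string.
    """
    # SQLite only supports a single writer, so Airflow requires the
    # SequentialExecutor when the metadata DB is SQLite.
    if executor != "SequentialExecutor" and sql_conn.startswith("sqlite"):
        raise ValueError(f"error: cannot use SQLite with the {executor}.")
```

Running this check during `meltano invoke airflow` setup would turn the misconfiguration into an actionable error.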
| 4 |
7,603,319 | 87,229,805 |
2021-05-17 20:00:29.571
|
Document how we're deploying Meltano for the hub
|
With https://gitlab.com/meltano/hub/-/issues/15 we're truly dogfooding Meltano (https://gitlab.com/meltano/meta/-/issues/4). This is a great opportunity to create some relevant content around this effort.
- [ ] Blog Post on how we're using Meltano to build the Hub
- [ ] Documentation update in several areas:
- [ ] Cross-link in https://meltano.com/docs/production.html
- [ ] We should have a section on the Hub on how we're building the hub "Architecture"
- [ ] Create a "Production Deployments in the Wild" section or something like that within the Meltano documentation. We can link ours and the GitLab data team's.
| 4 |
7,603,319 | 87,118,247 |
2021-05-15 05:39:44.192
|
Meltano cache error noise after a tap-postgres ELT
|
### What is the current *bug* behavior?
_What is happening now?_
### What is the expected *correct* behavior?
_What should be happening?_
### Steps to reproduce
Run a postgres to redshift ELT with key-based replication.
### Relevant logs and/or screenshots

### Possible fixes
_If you can, link to the line of code that might be responsible for the problem or suggest a fix_
### Further regression test
_Ensure we automatically catch similar issues in the future_
- [ ] Write additional adequate test cases and submit test results
- [ ] Test results should be reviewed by a person from the team
| 4 |
7,603,319 | 87,104,909 |
2021-05-14 18:02:00.150
|
New `meltano state` command to rename, alter, print, and copy job states
|
### Problem to solve
Sometimes it's nice to be able to change the `job_id` for a particular pipeline. For example, a second version of a plugin might be added (you started with `tap-google-analytics` but now have `tap-google-analytics-site1` and `tap-google-analytics-site2`) and you need to disambiguate, or you're still figuring out naming conventions, as we are.
If we just change it in `meltano.yml`, Meltano will see new runs as a new job and not find the existing state.
### Further details
As a workaround, we can go into the Meltano database and mess with the `job` table, but this is annoying and error prone.
### Proposal
I see a few directions here. We could add a command like `meltano job rename <old_job_id> <new_job_id>`. `meltano job` could also later include `list`, which might be helpful for building a `tap-meltano` in the future if you want to use Meltano to track pipelines. Alternate vocabulary could be `runs` instead of `jobs` as it seems like the codebase and docs are preferring that term.
Another way to do it would be to make a `meltano state` command to manage state. It could have `meltano state get <job_id>`, `meltano state set <job_id> <state>`, `meltano state reset <job_id>` (see #2568+). Then renaming could be something like `meltano state get <old_job_id> | meltano state set <new_job_id>`. Pipe syntax would be handy if you wanted to use a tool like `jq` to mutate states in complex ways.
Both of these options use a noun command as a verb command doesn't seem to make sense here. Currently Meltano seems to use a mix of verbs (`discover`, `init`, `install`, `upgrade`) and nouns (`user`, `schema`, `config`).
### Proposal Update (2022-01-31)
_(Appended by AJ.)_
To manage state migration, we'd add `get`, `set`, `clear`, and `copy` options a new `meltano state` command.
1. `list`
- `meltano state list` (output list of available Job IDs to STDOUT)
1. `set`
- `meltano state set <JOB-ID> <state-json-text>` (input from text string or STDIN)
- `meltano state set <JOB-ID> --input-file=<json-state-file>` (input from file)
- `cat state.json | meltano state set <JOB-ID>` (input from text string or STDIN)
1. `show` (or `get`)
- `meltano state show <JOB-ID>` (output to STDOUT)
- `meltano state show <JOB-ID> > <json-state-file>` (output to file)
1. `clear` (or `reset`)
- `meltano state clear <JOB-ID>` (or `meltano state reset`)
1. `copy`
- `meltano state copy <JOB-ID-1> <JOB-ID-2>`
1. `move`
- `meltano state move <JOB-ID-1> <JOB-ID-2>` (or `meltano state rename`)
1. `merge`
- `meltano state merge <FROM-JOB-ID> <TO-JOB-ID>` (update job 2 state, replacing and merging status from job 1)
### First iteration
As a first iteration, I think we'd just need items `#1-3`, which are `list`, `set`, and `show` - and optionally the fourth subcommand `clear`. With those three or four subcommands, users could accomplish the other functions in multi-step processes. For instance, instead of `copy`, the user could run something like `meltano state show <OLD-ID> | meltano state set` and accomplish the same effect. Similarly, a `rename` operation could have the same workaround as `copy` except also running a manual `clear` after the copy.
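The subcommand semantics can be sketched against a toy in-memory store. Real state lives in Meltano's `job` table; the class and method names below are illustrative only, mirroring the proposed CLI verbs:

```python
import json


class ToyStateStore:
    """Illustrative stand-in for Meltano's job-state table."""

    def __init__(self):
        self._states = {}

    def set(self, job_id, state_json):
        # Parse on the way in so invalid JSON fails fast.
        self._states[job_id] = json.loads(state_json)

    def show(self, job_id):
        return json.dumps(self._states[job_id])

    def list(self):
        return sorted(self._states)

    def clear(self, job_id):
        self._states.pop(job_id, None)

    def copy(self, src, dst):
        # Equivalent to: meltano state show SRC | meltano state set DST
        self.set(dst, self.show(src))

    def move(self, src, dst):
        # rename == copy + clear, as described above
        self.copy(src, dst)
        self.clear(src)
```

This also shows why `show`/`set` alone are sufficient for a first iteration: `copy` and `move` reduce to compositions of them.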
| 12 |
7,603,319 | 86,813,182 |
2021-05-10 22:00:55.190
|
Entity selection does not work when stream names have periods in them
|
### What is the current *bug* behavior?
I’ve installed a custom plugin for redshift as an extractor (`pip install tap-redshift`). I had to do some debugging of their code but I got it working all except for the entity selection. When I do a `meltano select tap-redshift --list --all`, I get a list of all the tables in the `schema` set in my config. However, when I try to add model selection to the `meltano.yml` file using the expected syntax (i.e., `database.schema.table.*`), I can’t get any of the entities in that list highlighted in green (‘selected’) no matter what I do. It almost feels like the select cli command doesn’t play nice with custom plugins.
### What is the expected *correct* behavior?
If I enter
```
select:
- database.schema.table.*
```
I would expect all columns in the listed table to be selected in green ('selected') after running a `meltano select --list --all` command.
### Steps to reproduce
Here's my `meltano.yml`
```
version: 1
plugins:
extractors:
- name: tap-redshift
namespace: tap_redshift
pip_url: tap-redshift
executable: tap-redshift
capabilities:
- config
- discover
- catalog
- properties
config:
host: ############
port: 5439
dbname: matillion
schema: referencedata
start_date: 1900-01-01T01:01:01Z
user: ########
password: #########
select:
- 'matillion.referencedata.t_ref_score_range.*'
metadata:
'*':
replication-method: FULL_TABLE
```
### Relevant logs and/or screenshots
```
% meltano select tap-redshift --list --all
Legend:
selected
excluded
automatic
Enabled patterns:
matillion.referencedata.t_ref_score_range.*
Selected attributes:
[excluded ] matillion.referencedata.t_ref_code_set.code_set_code
[excluded ] matillion.referencedata.t_ref_code_set.code_set_id
[excluded ] matillion.referencedata.t_ref_code_set.code_set_name
[excluded ] matillion.referencedata.t_ref_code_set.date_modified
[excluded ] matillion.referencedata.t_ref_code_set.modified_by
[excluded ] matillion.referencedata.t_ref_code_set.owner_name
[excluded ] matillion.referencedata.t_ref_code_set.steward_name
[excluded ] matillion.referencedata.t_ref_score_range.date_modified
[excluded ] matillion.referencedata.t_ref_score_range.lower_bound
[excluded ] matillion.referencedata.t_ref_score_range.metric_name
[excluded ] matillion.referencedata.t_ref_score_range.score_range_id
[excluded ] matillion.referencedata.t_ref_score_range.score_value
[excluded ] matillion.referencedata.t_ref_score_range.upper_bound
```
On a separate note, the workaround I found was to use quotes and ignore the first couple of dots in the entity name. Specifically:
```
select:
- '*t_ref_score_range.*'
```
gets me exactly what I want.
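The symptom (and why the workaround helps) is consistent with selection patterns being split into a stream part and a property part at a dot and then glob-matched. This is an assumption about the implementation, illustrated with `fnmatch`; the `selects` helper below is hypothetical, not Meltano's actual matching code:

```python
from fnmatch import fnmatch


def selects(pattern: str, stream: str, prop: str) -> bool:
    # Assumption: the pattern is split into <stream_pattern>.<prop_pattern>
    # at the first dot, then each part is glob-matched independently.
    stream_pat, _, prop_pat = pattern.partition(".")
    return fnmatch(stream, stream_pat) and fnmatch(prop, prop_pat)


# tap-redshift stream names themselves contain dots:
stream = "matillion.referencedata.t_ref_score_range"

# The "obvious" pattern fails: the stream pattern becomes just "matillion".
print(selects("matillion.referencedata.t_ref_score_range.*", stream, "score_value"))

# The workaround matches: "*t_ref_score_range" globs across the dots.
print(selects("*t_ref_score_range.*", stream, "score_value"))
```

Under this model, the leading `*` in the workaround absorbs the dotted prefix of the stream name, which is why quoting and dropping the first couple of dots works.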
### Possible fixes
_If you can, link to the line of code that might be responsible for the problem or suggest a fix_
Not sure
| 4 |
7,603,319 | 86,699,604 |
2021-05-08 04:35:49.969
|
Incorrect docs link after `meltano add transformer dbt`
|
After installing dbt, the docs link provided is `https://meltano.com/docs/transforms.md`. The correct link is: `https://meltano.com/docs/transforms.html`
```
ajsteers@ajs-macbook-pro meltano % meltano add transformer dbt
Transformer 'dbt' already exists in your Meltano project
To add it to your project another time so that each can be configured differently,
add a new plugin inheriting from the existing one with its own unique name:
meltano add transformer dbt--new --inherit-from dbt
Installing transformer 'dbt'...
Installed transformer 'dbt'
To learn more about transformer 'dbt', visit https://meltano.com/docs/transforms.md
```
| 1 |
7,603,319 | 86,543,574 |
2021-05-05 18:18:49.059
|
Meltano ELT fails when running on native windows
|
(needs the Windows Support tag)
To replicate on windows
```
PS C:\code\meltano-windowstest2> meltano --version
meltano, version 1.73.0
PS C:\code\meltano-windowstest2> $Env:PYTHONASYNCIODEBUG=1
PS C:\code\meltano-windowstest2> meltano --log-level=debug elt tap-csv target-csv
```
Output from the Meltano command is below. Note that the data in this instance actually does flow to my output file correctly, but the program hangs and I have to signal the process (CTRL+C) to stop it, which then leads to the exceptions being dumped. Also note that I tried to redirect the output to a log file (`>log 2>&1`), but after the CTRL+C was sent the log didn't include the exception information from asyncio, so this is just a straight copy from my PowerShell window instead.
The exceptions not being printed is probably caused by `loop.set_exception_handler` not being utilized. That could probably be a separate issue; fixing it would help some folks with debugging asyncio. I only stumbled on this because I mess up code all the time and was upset I couldn't see runtime exceptions.
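For reference, wiring up a loop-level handler so these errors surface is small. This is a hedged sketch of the stdlib mechanism only; where Meltano should install it, and how it should route into Meltano's logging, is up for discussion:

```python
import asyncio


def log_asyncio_exception(loop, context):
    # `context` always has "message"; "exception" is present when one was raised.
    exc = context.get("exception")
    suffix = f" ({exc!r})" if exc else ""
    print(f"asyncio error: {context['message']}{suffix}")


loop = asyncio.new_event_loop()
loop.set_exception_handler(log_asyncio_exception)
# Simulate the kind of internal error asyncio reports on Windows pipes.
loop.call_exception_handler({"message": "Fatal read error on pipe transport"})
loop.close()
```

With a handler installed, failures like the `_ProactorReadPipeTransport` errors above would be reported through the handler instead of silently swallowed or only visible in debug mode.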
**Output from a "successful" run**
```
PS C:\code\meltano-windowstest2> meltano --version
meltano, version 1.73.0
PS C:\code\meltano-windowstest2> $Env:PYTHONASYNCIODEBUG=1
PS C:\code\meltano-windowstest2> meltano --log-level=debug elt tap-csv target-csv
[2021-05-05 14:12:21,076] [12024|MainThread|root] [DEBUG] Creating engine <meltano.core.project.Project object at 0x000001A94C457970>@sqlite:///C:\code\meltano-windowstest2/.meltano/meltano.db
[2021-05-05 14:12:21,110] [12024|MainThread|asyncio] [DEBUG] Using proactor: IocpProactor
meltano | DEBUG Starting new HTTPS connection (1): www.meltano.com:443
meltano | DEBUG https://www.meltano.com:443 "GET /discovery.yml?project_id=1281de93-235e-4185-85ce-2b3555a991ae HTTP/1.1" 200 103053
meltano | DEBUG Variable '$MELTANO_LOAD_SCHEMA' is missing from the environment.
meltano | INFO Running extract & load...
meltano | DEBUG Created configuration at C:\code\meltano-windowstest2\.meltano\run\elt\2021-05-05T181221--tap-csv--target-csv\1aec1d92-7fb9-4e86-885f-f71ddf7f6db1\tap.config.json
meltano | DEBUG Could not find tap.properties.json in C:\code\meltano-windowstest2\.meltano\extractors\tap-csv\tap.properties.json, skipping.
meltano | DEBUG Could not find tap.properties.cache_key in C:\code\meltano-windowstest2\.meltano\extractors\tap-csv\tap.properties.cache_key, skipping.
meltano | DEBUG Could not find state.json in C:\code\meltano-windowstest2\.meltano\extractors\tap-csv\state.json, skipping.
meltano | DEBUG Variable '$MELTANO_LOAD_SCHEMA' is missing from the environment.
meltano | DEBUG Variable '$MELTANO_LOAD_SCHEMA' is missing from the environment.
meltano | DEBUG Created configuration at C:\code\meltano-windowstest2\.meltano\run\elt\2021-05-05T181221--tap-csv--target-csv\1aec1d92-7fb9-4e86-885f-f71ddf7f6db1\target.config.json
meltano | WARNING No state was found, complete import.
meltano | DEBUG Invoking: ['C:\\code\\meltano-windowstest2\\.meltano\\extractors\\tap-csv\\venv\\Scripts\\tap-csv', '--config', 'C:\\code\\meltano-windowstest2\\.meltano\\run\\elt\\2021-05-05T181221--tap-csv--target-csv\\1aec1d92-7fb9-4e86-885f-f71ddf7f6db1\\tap.config.json', '--discover']
meltano | DEBUG Env: REDACTED
meltano | WARNING A catalog file was found, but it will be ignored as the extractor does not advertise the `catalog` or `properties` capability
meltano | DEBUG Invoking: ['C:\\code\\meltano-windowstest2\\.meltano\\extractors\\tap-csv\\venv\\Scripts\\tap-csv', '--config', 'C:\\code\\meltano-windowstest2\\.meltano\\run\\elt\\2021-05-05T181221--tap-csv--target-csv\\1aec1d92-7fb9-4e86-885f-f71ddf7f6db1\\tap.config.json']
meltano | DEBUG Env: REDACTED
meltano | DEBUG execute program 'C:\\code\\meltano-windowstest2\\.meltano\\extractors\\tap-csv\\venv\\Scripts\\tap-csv' stdout=<pipe> stderr=<pipe>
meltano | DEBUG process 'C:\\code\\meltano-windowstest2\\.meltano\\extractors\\tap-csv\\venv\\Scripts\\tap-csv' created: pid 7140
meltano | WARNING Executing <Task pending name='Task-1' coro=<_run_job() running at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py:214> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001A9533305E0>()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py:422> cb=[_run_until_complete_cb() at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py:184] created at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\utils\__init__.py:37> took 0.969 seconds
meltano | DEBUG <_ProactorReadPipeTransport fd=4>: Fatal read error on pipe transport
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 299, in _loop_reading
self._read_fut = self._loop._proactor.recv(self._sock, 32768)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 445, in recv
self._register_with_iocp(conn)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 718, in _register_with_iocp
_overlapped.CreateIoCompletionPort(obj.fileno(), self._iocp, 0, 0)
OSError: [WinError 87] The parameter is incorrect
meltano | ERROR Exception in callback _ProactorReadPipeTransport._loop_reading()
handle: <Handle _ProactorReadPipeTransport._loop_reading() created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py:187>
source_traceback: Object created at (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 316, in run_forever
super().run_forever()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 570, in run_forever
self._run_once()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1851, in _run_once
handle._run()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\events.py", line 81, in _run
self._context.run(self._callback, *self._args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 201, in _read_from_fd
read_transport, _ = await loop.connect_read_pipe(
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1525, in connect_read_pipe
transport = self._make_read_pipe_transport(pipe, protocol, waiter)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 664, in _make_read_pipe_transport
return _ProactorReadPipeTransport(self, sock, protocol, waiter, extra)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 187, in __init__
self._loop.call_soon(self._loop_reading)
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 299, in _loop_reading
self._read_fut = self._loop._proactor.recv(self._sock, 32768)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 445, in recv
self._register_with_iocp(conn)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 718, in _register_with_iocp
_overlapped.CreateIoCompletionPort(obj.fileno(), self._iocp, 0, 0)
OSError: [WinError 87] The parameter is incorrect
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\events.py", line 81, in _run
self._context.run(self._callback, *self._args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 309, in _loop_reading
self._fatal_error(exc, 'Fatal read error on pipe transport')
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 131, in _fatal_error
self._force_close(exc)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 134, in _force_close
if self._empty_waiter is not None and not self._empty_waiter.done():
AttributeError: '_ProactorReadPipeTransport' object has no attribute '_empty_waiter'
meltano | DEBUG <_ProactorReadPipeTransport fd=6>: Fatal read error on pipe transport
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 299, in _loop_reading
self._read_fut = self._loop._proactor.recv(self._sock, 32768)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 445, in recv
self._register_with_iocp(conn)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 718, in _register_with_iocp
_overlapped.CreateIoCompletionPort(obj.fileno(), self._iocp, 0, 0)
OSError: [WinError 87] The parameter is incorrect
meltano | ERROR Exception in callback _ProactorReadPipeTransport._loop_reading()
handle: <Handle _ProactorReadPipeTransport._loop_reading() created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py:187>
source_traceback: Object created at (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 316, in run_forever
super().run_forever()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 570, in run_forever
self._run_once()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1851, in _run_once
handle._run()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\events.py", line 81, in _run
self._context.run(self._callback, *self._args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 201, in _read_from_fd
read_transport, _ = await loop.connect_read_pipe(
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1525, in connect_read_pipe
transport = self._make_read_pipe_transport(pipe, protocol, waiter)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 664, in _make_read_pipe_transport
return _ProactorReadPipeTransport(self, sock, protocol, waiter, extra)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 187, in __init__
self._loop.call_soon(self._loop_reading)
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 299, in _loop_reading
self._read_fut = self._loop._proactor.recv(self._sock, 32768)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 445, in recv
self._register_with_iocp(conn)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 718, in _register_with_iocp
_overlapped.CreateIoCompletionPort(obj.fileno(), self._iocp, 0, 0)
OSError: [WinError 87] The parameter is incorrect
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\events.py", line 81, in _run
self._context.run(self._callback, *self._args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 309, in _loop_reading
self._fatal_error(exc, 'Fatal read error on pipe transport')
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 131, in _fatal_error
self._force_close(exc)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 134, in _force_close
if self._empty_waiter is not None and not self._empty_waiter.done():
AttributeError: '_ProactorReadPipeTransport' object has no attribute '_empty_waiter'
meltano | DEBUG Read pipe 4 connected: (<_ProactorReadPipeTransport fd=4>, <asyncio.streams.StreamReaderProtocol object at 0x000001A9530B7B20>)
meltano | DEBUG Read pipe 6 connected: (<_ProactorReadPipeTransport fd=6>, <asyncio.streams.StreamReaderProtocol object at 0x000001A9530B7CD0>)
meltano | DEBUG Read pipe 836 connected: (<_ProactorReadPipeTransport fd=836 read=<_OverlappedFuture pending overlapped=<pending, 0x1a953203dc0> cb=[_ProactorReadPipeTransport._loop_reading()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>>, <ReadSubprocessPipeProto fd=1 pipe=<_ProactorReadPipeTransport fd=836 read=<_OverlappedFuture pending overlapped=<pending, 0x1a953203dc0> cb=[_ProactorReadPipeTransport._loop_reading()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>>>)
meltano | DEBUG Read pipe 876 connected: (<_ProactorReadPipeTransport fd=876 read=<_OverlappedFuture pending overlapped=<pending, 0x1a9531180d0> cb=[_ProactorReadPipeTransport._loop_reading()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>>, <ReadSubprocessPipeProto fd=2 pipe=<_ProactorReadPipeTransport fd=876 read=<_OverlappedFuture pending overlapped=<pending, 0x1a9531180d0> cb=[_ProactorReadPipeTransport._loop_reading()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>>>)
meltano | INFO execute program 'C:\\code\\meltano-windowstest2\\.meltano\\extractors\\tap-csv\\venv\\Scripts\\tap-csv': <_WindowsSubprocessTransport pid=7140 running stdout=<_ProactorReadPipeTransport fd=836 read=<_OverlappedFuture pending overlapped=<pending, 0x1a953203dc0> cb=[_ProactorReadPipeTransport._loop_reading()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>> stderr=<_ProactorReadPipeTransport fd=876 read=<_OverlappedFuture pending overlapped=<pending, 0x1a9531180d0> cb=[_ProactorReadPipeTransport._loop_reading()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>>>
meltano | DEBUG Invoking: ['C:\\code\\meltano-windowstest2\\.meltano\\loaders\\target-csv\\venv\\Scripts\\target-csv', '--config', 'C:\\code\\meltano-windowstest2\\.meltano\\run\\elt\\2021-05-05T181221--tap-csv--target-csv\\1aec1d92-7fb9-4e86-885f-f71ddf7f6db1\\target.config.json']
meltano | DEBUG Env: REDACTED
meltano | DEBUG execute program 'C:\\code\\meltano-windowstest2\\.meltano\\loaders\\target-csv\\venv\\Scripts\\target-csv' stdin=<pipe> stdout=<pipe> stderr=<pipe>
meltano | DEBUG process 'C:\\code\\meltano-windowstest2\\.meltano\\loaders\\target-csv\\venv\\Scripts\\target-csv' created: pid 18608
meltano | DEBUG Write pipe 904 connected: (<_ProactorWritePipeTransport fd=904 read=<_OverlappedFuture pending overlapped=<pending, 0x1a953375430> cb=[_ProactorWritePipeTransport._pipe_closed()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>>, <WriteSubprocessPipeProto fd=0 pipe=<_ProactorWritePipeTransport fd=904 read=<_OverlappedFuture pending overlapped=<pending, 0x1a953375430> cb=[_ProactorWritePipeTransport._pipe_closed()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>>>)
meltano | DEBUG Read pipe 968 connected: (<_ProactorReadPipeTransport fd=968 read=<_OverlappedFuture pending overlapped=<pending, 0x1a953375550> cb=[_ProactorReadPipeTransport._loop_reading()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>>, <ReadSubprocessPipeProto fd=1 pipe=<_ProactorReadPipeTransport fd=968 read=<_OverlappedFuture pending overlapped=<pending, 0x1a953375550> cb=[_ProactorReadPipeTransport._loop_reading()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>>>)
meltano | DEBUG Read pipe 920 connected: (<_ProactorReadPipeTransport fd=920 read=<_OverlappedFuture pending overlapped=<pending, 0x1a953375670> cb=[_ProactorReadPipeTransport._loop_reading()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>>, <ReadSubprocessPipeProto fd=2 pipe=<_ProactorReadPipeTransport fd=920 read=<_OverlappedFuture pending overlapped=<pending, 0x1a953375670> cb=[_ProactorReadPipeTransport._loop_reading()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>>>)
meltano | INFO execute program 'C:\\code\\meltano-windowstest2\\.meltano\\loaders\\target-csv\\venv\\Scripts\\target-csv': <_WindowsSubprocessTransport pid=18608 running stdin=<_ProactorWritePipeTransport fd=904 read=<_OverlappedFuture pending overlapped=<pending, 0x1a953375430> cb=[_ProactorWritePipeTransport._pipe_closed()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>> stdout=<_ProactorReadPipeTransport fd=968 read=<_OverlappedFuture pending overlapped=<pending, 0x1a953375550> cb=[_ProactorReadPipeTransport._loop_reading()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>> stderr=<_ProactorReadPipeTransport fd=920 read=<_OverlappedFuture pending overlapped=<pending, 0x1a953375670> cb=[_ProactorReadPipeTransport._loop_reading()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py:465>>>
tap-csv | INFO Starting sync
tap-csv | INFO Syncing entity 'things' from file: 'input.csv'
tap-csv (out) | {"type": "SCHEMA", "stream": "things", "schema": {"type": "object", "properties": {"id": {"type": "string"}, " name": {"type": "string"}}}, "key_properties": ["id"]}
tap-csv (out) | {"type": "RECORD", "stream": "things", "record": {"id": "1", " name": " Derek"}}
tap-csv (out) | {"type": "RECORD", "stream": "things", "record": {"id": "2", " name": " Cener"}}
tap-csv (out) | {"type": "STATE", "value": {}}
tap-csv | INFO Sync completed
meltano | DEBUG <_ProactorReadPipeTransport fd=876> received EOF
meltano | DEBUG <_ProactorReadPipeTransport fd=836> received EOF
meltano | INFO <_WindowsSubprocessTransport pid=7140 running stdout=<_ProactorReadPipeTransport closing fd=836> stderr=<_ProactorReadPipeTransport closing fd=876>> exited with return code 0
target-csv | INFO Sending version information to singer.io. To disable sending anonymous usage data, set the config parameter "disable_collection" to true
target-csv (out) | {}
meltano | INFO Incremental state has been updated at 2021-05-05 18:12:22.353325.
meltano | DEBUG Incremental state: {}
meltano | DEBUG <_ProactorReadPipeTransport fd=968> received EOF
meltano | DEBUG <_ProactorReadPipeTransport fd=920> received EOF
meltano | INFO <_WindowsSubprocessTransport pid=18608 running stdin=<_ProactorWritePipeTransport closed> stdout=<_ProactorReadPipeTransport closing fd=968> stderr=<_ProactorReadPipeTransport closing fd=920>> exited with return code 0
meltano | DEBUG Deleted configuration at C:\code\meltano-windowstest2\.meltano\run\elt\2021-05-05T181221--tap-csv--target-csv\1aec1d92-7fb9-4e86-885f-f71ddf7f6db1\target.config.json
meltano | DEBUG Deleted configuration at C:\code\meltano-windowstest2\.meltano\run\elt\2021-05-05T181221--tap-csv--target-csv\1aec1d92-7fb9-4e86-885f-f71ddf7f6db1\tap.config.json
meltano | INFO Extract & load complete!
meltano | INFO Transformation skipped.
Aborted!
Exception ignored in: <coroutine object _run_job at 0x000001A9530120C0>
RuntimeError: coroutine ignored GeneratorExit
meltano | ERROR Task was destroyed but it is pending!
source_traceback: Object created at (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "c:\users\derek\appdata\local\programs\python\python38\lib\runpy.py", line 87, in _run_code
exec(code, run_globals)
File "C:\Users\derek\AppData\Local\Programs\Python\Python38\Scripts\meltano.exe\__main__.py", line 7, in <module>
sys.exit(main())
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\__init__.py", line 43, in main
cli(obj={"project": None})
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 782, in main
rv = self.invoke(ctx)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 610, in invoke
return callback(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\params.py", line 23, in decorate
return func(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\params.py", line 57, in decorate
func(project, *args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 137, in elt
run_async(_run_job(project, job, session, context_builder, force=force))
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\utils\__init__.py", line 37, in run_async
future = asyncio.ensure_future(coro)
task: <Task pending name='Task-1' coro=<_run_job() running at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py:214> wait_for=<Task pending name='Task-4' coro=<Out._read_from_fd() running at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:205> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001A9530B7CA0>()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py:422> cb=[<TaskWakeupMethWrapper object at 0x000001A953330040>()] created at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:148> created at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\utils\__init__.py:37>
--- Logging error ---
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\logging\__init__.py", line 1084, in emit
stream.write(msg + self.terminator)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 65, in write
self.__out.writeline(line)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 174, in writeline
click.echo(self.prefix + line, nl=False, file=self)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\utils.py", line 272, in echo
file.write(message)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\_compat.py", line 710, in _safe_write
return _write(s)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\colorama\ansitowin32.py", line 41, in write
self.__convertor.write(text)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\colorama\ansitowin32.py", line 164, in write
self.wrapped.write(text)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 179, in write
self.file.write(remove_ansi_escape_sequences(data))
ValueError: I/O operation on closed file.
Call stack:
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1733, in call_exception_handler
self.default_exception_handler(context)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1707, in default_exception_handler
logger.error('\n'.join(log_lines), exc_info=exc_info)
Message: 'Task was destroyed but it is pending!\nsource_traceback: Object created at (most recent call last):\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\runpy.py", line 194, in _run_module_as_main\n return _run_code(code, main_globals, None,\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\runpy.py", line 87, in _run_code\n exec(code, run_globals)\n File "C:\\Users\\derek\\AppData\\Local\\Programs\\Python\\Python38\\Scripts\\meltano.exe\\__main__.py", line 7, in <module>\n sys.exit(main())\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\__init__.py", line 43, in main\n cli(obj={"project": None})\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 829, in __call__\n return self.main(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 782, in main\n rv = self.invoke(ctx)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 1259, in invoke\n return _process_result(sub_ctx.command.invoke(sub_ctx))\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 1066, in invoke\n return ctx.invoke(self.callback, **ctx.params)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 610, in invoke\n return callback(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\params.py", line 23, in decorate\n return func(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\params.py", line 57, in decorate\n func(project, *args, **kwargs)\n File 
"c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 137, in elt\n run_async(_run_job(project, job, session, context_builder, force=force))\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\utils\\__init__.py", line 37, in run_async\n future = asyncio.ensure_future(coro)\ntask: <Task pending name=\'Task-1\' coro=<_run_job() running at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py:214> wait_for=<Task pending name=\'Task-4\' coro=<Out._read_from_fd() running at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py:205> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001A9530B7CA0>()] created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py:422> cb=[<TaskWakeupMethWrapper object at 0x000001A953330040>()] created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py:148> created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\utils\\__init__.py:37>'
Arguments: ()
Exception ignored in: <generator object Job._handling_sigterm at 0x000001A953096AC0>
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\job\job.py", line 229, in _handling_sigterm
signal.signal(signal.SIGTERM, original_termination_handler)
File "c:\users\derek\appdata\local\programs\python\python38\lib\signal.py", line 47, in signal
handler = _signal.signal(_enum_to_int(signalnum), _enum_to_int(handler))
TypeError: signal handler must be signal.SIG_IGN, signal.SIG_DFL, or a callable object
meltano | ERROR Task was destroyed but it is pending!
source_traceback: Object created at (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "c:\users\derek\appdata\local\programs\python\python38\lib\runpy.py", line 87, in _run_code
exec(code, run_globals)
File "C:\Users\derek\AppData\Local\Programs\Python\Python38\Scripts\meltano.exe\__main__.py", line 7, in <module>
sys.exit(main())
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\__init__.py", line 43, in main
cli(obj={"project": None})
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 782, in main
rv = self.invoke(ctx)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 610, in invoke
return callback(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\params.py", line 23, in decorate
return func(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\params.py", line 57, in decorate
func(project, *args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 137, in elt
run_async(_run_job(project, job, session, context_builder, force=force))
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\utils\__init__.py", line 39, in run_async
loop.run_until_complete(future)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 603, in run_until_complete
self.run_forever()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 316, in run_forever
super().run_forever()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 570, in run_forever
self._run_once()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1851, in _run_once
handle._run()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\events.py", line 81, in _run
self._context.run(self._callback, *self._args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 214, in _run_job
await _run_elt(project, context_builder, output_logger)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 232, in _run_elt
async with _redirect_output(output_logger):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 223, in _redirect_output
async with meltano_stdout.redirect_stdout(), meltano_stderr.redirect_stderr():
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 161, in redirect_stdout
async with self.writer() as stdout:
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 148, in writer
reader = asyncio.ensure_future(self._read_from_fd(read_fd))
task: <Task pending name='Task-3' coro=<Out._read_from_fd() running at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:205> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001A9531164F0>()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py:422> created at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:148>
--- Logging error ---
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\logging\__init__.py", line 1084, in emit
stream.write(msg + self.terminator)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 65, in write
self.__out.writeline(line)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 174, in writeline
click.echo(self.prefix + line, nl=False, file=self)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\utils.py", line 272, in echo
file.write(message)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\_compat.py", line 710, in _safe_write
return _write(s)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\colorama\ansitowin32.py", line 41, in write
self.__convertor.write(text)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\colorama\ansitowin32.py", line 164, in write
self.wrapped.write(text)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 179, in write
self.file.write(remove_ansi_escape_sequences(data))
ValueError: I/O operation on closed file.
Call stack:
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1733, in call_exception_handler
self.default_exception_handler(context)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1707, in default_exception_handler
logger.error('\n'.join(log_lines), exc_info=exc_info)
Message: 'Task was destroyed but it is pending!\nsource_traceback: Object created at (most recent call last):\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\runpy.py", line 194, in _run_module_as_main\n return _run_code(code, main_globals, None,\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\runpy.py", line 87, in _run_code\n exec(code, run_globals)\n File "C:\\Users\\derek\\AppData\\Local\\Programs\\Python\\Python38\\Scripts\\meltano.exe\\__main__.py", line 7, in <module>\n sys.exit(main())\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\__init__.py", line 43, in main\n cli(obj={"project": None})\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 829, in __call__\n return self.main(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 782, in main\n rv = self.invoke(ctx)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 1259, in invoke\n return _process_result(sub_ctx.command.invoke(sub_ctx))\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 1066, in invoke\n return ctx.invoke(self.callback, **ctx.params)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 610, in invoke\n return callback(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\params.py", line 23, in decorate\n return func(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\params.py", line 57, in decorate\n func(project, *args, **kwargs)\n File 
"c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 137, in elt\n run_async(_run_job(project, job, session, context_builder, force=force))\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\utils\\__init__.py", line 39, in run_async\n loop.run_until_complete(future)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py", line 603, in run_until_complete\n self.run_forever()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\windows_events.py", line 316, in run_forever\n super().run_forever()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py", line 570, in run_forever\n self._run_once()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py", line 1851, in _run_once\n handle._run()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\events.py", line 81, in _run\n self._context.run(self._callback, *self._args)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 214, in _run_job\n await _run_elt(project, context_builder, output_logger)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 232, in _run_elt\n async with _redirect_output(output_logger):\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\async_generator\\_util.py", line 34, in __aenter__\n return await self._agen.asend(None)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 223, in _redirect_output\n async with meltano_stdout.redirect_stdout(), meltano_stderr.redirect_stderr():\n File 
"c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\async_generator\\_util.py", line 34, in __aenter__\n return await self._agen.asend(None)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py", line 161, in redirect_stdout\n async with self.writer() as stdout:\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\async_generator\\_util.py", line 34, in __aenter__\n return await self._agen.asend(None)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py", line 148, in writer\n reader = asyncio.ensure_future(self._read_from_fd(read_fd))\ntask: <Task pending name=\'Task-3\' coro=<Out._read_from_fd() running at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py:205> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001A9531164F0>()] created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py:422> created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py:148>'
Arguments: ()
meltano | ERROR Task was destroyed but it is pending!
source_traceback: Object created at (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "c:\users\derek\appdata\local\programs\python\python38\lib\runpy.py", line 87, in _run_code
exec(code, run_globals)
File "C:\Users\derek\AppData\Local\Programs\Python\Python38\Scripts\meltano.exe\__main__.py", line 7, in <module>
sys.exit(main())
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\__init__.py", line 43, in main
cli(obj={"project": None})
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 782, in main
rv = self.invoke(ctx)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 610, in invoke
return callback(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\params.py", line 23, in decorate
return func(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\params.py", line 57, in decorate
func(project, *args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 137, in elt
run_async(_run_job(project, job, session, context_builder, force=force))
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\utils\__init__.py", line 39, in run_async
loop.run_until_complete(future)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 603, in run_until_complete
self.run_forever()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 316, in run_forever
super().run_forever()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 570, in run_forever
self._run_once()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1851, in _run_once
handle._run()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\events.py", line 81, in _run
self._context.run(self._callback, *self._args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 214, in _run_job
await _run_elt(project, context_builder, output_logger)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 232, in _run_elt
async with _redirect_output(output_logger):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 223, in _redirect_output
async with meltano_stdout.redirect_stdout(), meltano_stderr.redirect_stderr():
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 167, in redirect_stderr
async with self.writer() as stderr:
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 148, in writer
reader = asyncio.ensure_future(self._read_from_fd(read_fd))
task: <Task pending name='Task-4' coro=<Out._read_from_fd() running at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:205> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001A9530B7CA0>()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py:422> cb=[<TaskWakeupMethWrapper object at 0x000001A953330040>()] created at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:148>
--- Logging error ---
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\logging\__init__.py", line 1084, in emit
stream.write(msg + self.terminator)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 65, in write
self.__out.writeline(line)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 174, in writeline
click.echo(self.prefix + line, nl=False, file=self)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\utils.py", line 272, in echo
file.write(message)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\_compat.py", line 710, in _safe_write
return _write(s)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\colorama\ansitowin32.py", line 41, in write
self.__convertor.write(text)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\colorama\ansitowin32.py", line 164, in write
self.wrapped.write(text)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 179, in write
self.file.write(remove_ansi_escape_sequences(data))
ValueError: I/O operation on closed file.
Call stack:
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1733, in call_exception_handler
self.default_exception_handler(context)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1707, in default_exception_handler
logger.error('\n'.join(log_lines), exc_info=exc_info)
Message: 'Task was destroyed but it is pending!\nsource_traceback: Object created at (most recent call last):\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\runpy.py", line 194, in _run_module_as_main\n return _run_code(code, main_globals, None,\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\runpy.py", line 87, in _run_code\n exec(code, run_globals)\n File "C:\\Users\\derek\\AppData\\Local\\Programs\\Python\\Python38\\Scripts\\meltano.exe\\__main__.py", line 7, in <module>\n sys.exit(main())\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\__init__.py", line 43, in main\n cli(obj={"project": None})\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 829, in __call__\n return self.main(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 782, in main\n rv = self.invoke(ctx)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 1259, in invoke\n return _process_result(sub_ctx.command.invoke(sub_ctx))\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 1066, in invoke\n return ctx.invoke(self.callback, **ctx.params)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 610, in invoke\n return callback(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\params.py", line 23, in decorate\n return func(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\params.py", line 57, in decorate\n func(project, *args, **kwargs)\n File 
"c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 137, in elt\n run_async(_run_job(project, job, session, context_builder, force=force))\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\utils\\__init__.py", line 39, in run_async\n loop.run_until_complete(future)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py", line 603, in run_until_complete\n self.run_forever()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\windows_events.py", line 316, in run_forever\n super().run_forever()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py", line 570, in run_forever\n self._run_once()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py", line 1851, in _run_once\n handle._run()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\events.py", line 81, in _run\n self._context.run(self._callback, *self._args)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 214, in _run_job\n await _run_elt(project, context_builder, output_logger)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 232, in _run_elt\n async with _redirect_output(output_logger):\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\async_generator\\_util.py", line 34, in __aenter__\n return await self._agen.asend(None)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 223, in _redirect_output\n async with meltano_stdout.redirect_stdout(), meltano_stderr.redirect_stderr():\n File 
"c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\async_generator\\_util.py", line 34, in __aenter__\n return await self._agen.asend(None)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py", line 167, in redirect_stderr\n async with self.writer() as stderr:\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\async_generator\\_util.py", line 34, in __aenter__\n return await self._agen.asend(None)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py", line 148, in writer\n reader = asyncio.ensure_future(self._read_from_fd(read_fd))\ntask: <Task pending name=\'Task-4\' coro=<Out._read_from_fd() running at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py:205> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001A9530B7CA0>()] created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py:422> cb=[<TaskWakeupMethWrapper object at 0x000001A953330040>()] created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py:148>'
Arguments: ()
Exception ignored in: <coroutine object _AsyncGeneratorContextManager.__aexit__ at 0x000001A953180AC0>
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 84, in __aexit__
raise
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 14, in __aexit__
await self._aiter.aclose()
RuntimeError: aclose(): asynchronous generator is already running
Exception ignored in: <coroutine object _AsyncGeneratorContextManager.__aexit__ at 0x000001A953180DC0>
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 84, in __aexit__
raise
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 14, in __aexit__
await self._aiter.aclose()
RuntimeError: aclose(): asynchronous generator is already running
```
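For what it's worth, the `TypeError: signal handler must be signal.SIG_IGN, signal.SIG_DFL, or a callable object` in the sigterm handling above is reproducible in isolation. This is a minimal sketch (hypothetical, not Meltano's actual code): `signal.getsignal()` returns `None` for handlers that were not installed from Python, and passing that `None` back into `signal.signal()` raises exactly this error:

```python
import signal

# Stand-in for a saved "original" handler that turns out not to be restorable.
# signal.getsignal() can return None when the handler wasn't set from Python.
original_handler = None

try:
    # Attempting to "restore" a None handler raises:
    #   TypeError: signal handler must be signal.SIG_IGN, signal.SIG_DFL,
    #   or a callable object
    signal.signal(signal.SIGTERM, original_handler)
    raised_type_error = False
except TypeError:
    raised_type_error = True

print(raised_type_error)
```

So the restore in `Job._handling_sigterm` may need to guard against a non-restorable saved handler (e.g. fall back to `signal.SIG_DFL`), though I'm not sure that's the root cause here.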
**Output of the same command with a problem reading my catalog file** (ignore the catalog errors and you'll see the asyncio issues). When there's an issue in the tap, asyncio tends to blow up, surfacing the pending-task/waiter errors shown above. Maybe it's related to how a tap failure is handled?
```
PS C:\code\meltano-windowstest2> meltano --log-level=debug elt tap-oracle target-csv
[2021-05-05 14:04:38,856] [21520|MainThread|root] [DEBUG] Creating engine <meltano.core.project.Project object at 0x000001F96C087970>@sqlite:///C:\code\meltano-windowstest2/.meltano/meltano.db
[2021-05-05 14:04:38,888] [21520|MainThread|asyncio] [DEBUG] Using proactor: IocpProactor
meltano | DEBUG Starting new HTTPS connection (1): www.meltano.com:443
meltano | DEBUG https://www.meltano.com:443 "GET /discovery.yml?project_id=1281de93-235e-4185-85ce-2b3555a991ae HTTP/1.1" 200 103053
meltano | DEBUG Variable '$MELTANO_LOAD_SCHEMA' is missing from the environment.
meltano | INFO Running extract & load...
meltano | DEBUG Created configuration at C:\code\meltano-windowstest2\.meltano\run\elt\2021-05-05T180438--tap-oracle--target-csv\8b145347-43c8-402e-ac5f-1e297b0e2f0a\tap.config.json
meltano | DEBUG Could not find tap.properties.json in C:\code\meltano-windowstest2\.meltano\extractors\tap-oracle\tap.properties.json, skipping.
meltano | DEBUG Could not find tap.properties.cache_key in C:\code\meltano-windowstest2\.meltano\extractors\tap-oracle\tap.properties.cache_key, skipping.
meltano | DEBUG Could not find state.json in C:\code\meltano-windowstest2\.meltano\extractors\tap-oracle\state.json, skipping.
meltano | DEBUG Variable '$MELTANO_LOAD_SCHEMA' is missing from the environment.
meltano | DEBUG Variable '$MELTANO_LOAD_SCHEMA' is missing from the environment.
meltano | DEBUG Created configuration at C:\code\meltano-windowstest2\.meltano\run\elt\2021-05-05T180438--tap-oracle--target-csv\8b145347-43c8-402e-ac5f-1e297b0e2f0a\target.config.json
meltano | WARNING No state was found, complete import.
meltano | INFO Found catalog in C:\code\meltano-windowstest2\powerschoolcatalog.json
meltano | DEBUG Deleted configuration at C:\code\meltano-windowstest2\.meltano\run\elt\2021-05-05T180438--tap-oracle--target-csv\8b145347-43c8-402e-ac5f-1e297b0e2f0a\target.config.json
meltano | DEBUG Deleted configuration at C:\code\meltano-windowstest2\.meltano\run\elt\2021-05-05T180438--tap-oracle--target-csv\8b145347-43c8-402e-ac5f-1e297b0e2f0a\tap.config.json
meltano | DEBUG ELT could not be completed: Cannot start extractor: Catalog discovery failed: invalid catalog: Expecting value: line 1 column 1 (char 0)
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\plugin\singer\tap.py", line 263, in discover_catalog
catalog = json.load(catalog_file)
File "c:\users\derek\appdata\local\programs\python\python38\lib\json\__init__.py", line 293, in load
return loads(fp.read(),
File "c:\users\derek\appdata\local\programs\python\python38\lib\json\__init__.py", line 357, in loads
return _default_decoder.decode(s)
File "c:\users\derek\appdata\local\programs\python\python38\lib\json\decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "c:\users\derek\appdata\local\programs\python\python38\lib\json\decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\runner\singer.py", line 95, in invoke
p_tap = await tap.invoke_async(
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\plugin_invoker.py", line 231, in invoke_async
with self._invoke(*args, **kwargs) as (popen_args, popen_options, popen_env):
File "c:\users\derek\appdata\local\programs\python\python38\lib\contextlib.py", line 113, in __enter__
return next(self.gen)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\plugin_invoker.py", line 212, in _invoke
with self.plugin.trigger_hooks("invoke", self, args):
File "c:\users\derek\appdata\local\programs\python\python38\lib\contextlib.py", line 113, in __enter__
return next(self.gen)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\behavior\hookable.py", line 70, in trigger_hooks
self.__class__.trigger(self, f"before_{hook_name}", *args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\behavior\hookable.py", line 97, in trigger
raise err
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\behavior\hookable.py", line 89, in trigger
hook_func(target, *args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\plugin\singer\tap.py", line 217, in discover_catalog_hook
self.discover_catalog(plugin_invoker, exec_args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\plugin\singer\tap.py", line 267, in discover_catalog
raise PluginExecutionError(
meltano.core.plugin.error.PluginExecutionError: Catalog discovery failed: invalid catalog: Expecting value: line 1 column 1 (char 0)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 237, in _run_elt
await _run_extract_load(elt_context, output_logger)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 275, in _run_extract_load
await singer_runner.run(
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\runner\singer.py", line 257, in run
await self.invoke(
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\runner\singer.py", line 101, in invoke
raise RunnerError(f"Cannot start extractor: {err}") from err
meltano.core.runner.RunnerError: Cannot start extractor: Catalog discovery failed: invalid catalog: Expecting value: line 1 column 1 (char 0)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 225, in _redirect_output
yield
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 246, in _run_elt
raise CliError(f"ELT could not be completed: {err}") from err
meltano.cli.utils.CliError: ELT could not be completed: Cannot start extractor: Catalog discovery failed: invalid catalog: Expecting value: line 1 column 1 (char 0)
meltano | WARNING Executing <Task pending name='Task-1' coro=<_run_job() running at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py:214> wait_for=<Task pending name='Task-4' coro=<Out._read_from_fd() running at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:192> cb=[<TaskWakeupMethWrapper object at 0x000001F972E2D9A0>()] created at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:148> cb=[_run_until_complete_cb() at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py:184] created at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\utils\__init__.py:37> took 1.000 seconds
meltano | DEBUG <_ProactorReadPipeTransport fd=4>: Fatal read error on pipe transport
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 299, in _loop_reading
self._read_fut = self._loop._proactor.recv(self._sock, 32768)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 445, in recv
self._register_with_iocp(conn)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 718, in _register_with_iocp
_overlapped.CreateIoCompletionPort(obj.fileno(), self._iocp, 0, 0)
OSError: [WinError 87] The parameter is incorrect
meltano | ERROR Exception in callback _ProactorReadPipeTransport._loop_reading()
handle: <Handle _ProactorReadPipeTransport._loop_reading() created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py:187>
source_traceback: Object created at (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 316, in run_forever
super().run_forever()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 570, in run_forever
self._run_once()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1851, in _run_once
handle._run()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\events.py", line 81, in _run
self._context.run(self._callback, *self._args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 201, in _read_from_fd
read_transport, _ = await loop.connect_read_pipe(
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1525, in connect_read_pipe
transport = self._make_read_pipe_transport(pipe, protocol, waiter)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 664, in _make_read_pipe_transport
return _ProactorReadPipeTransport(self, sock, protocol, waiter, extra)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 187, in __init__
self._loop.call_soon(self._loop_reading)
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 299, in _loop_reading
self._read_fut = self._loop._proactor.recv(self._sock, 32768)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 445, in recv
self._register_with_iocp(conn)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 718, in _register_with_iocp
_overlapped.CreateIoCompletionPort(obj.fileno(), self._iocp, 0, 0)
OSError: [WinError 87] The parameter is incorrect
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\events.py", line 81, in _run
self._context.run(self._callback, *self._args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 309, in _loop_reading
self._fatal_error(exc, 'Fatal read error on pipe transport')
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 131, in _fatal_error
self._force_close(exc)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 134, in _force_close
if self._empty_waiter is not None and not self._empty_waiter.done():
AttributeError: '_ProactorReadPipeTransport' object has no attribute '_empty_waiter'
meltano | DEBUG <_ProactorReadPipeTransport fd=6>: Fatal read error on pipe transport
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 299, in _loop_reading
self._read_fut = self._loop._proactor.recv(self._sock, 32768)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 445, in recv
self._register_with_iocp(conn)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 718, in _register_with_iocp
_overlapped.CreateIoCompletionPort(obj.fileno(), self._iocp, 0, 0)
OSError: [WinError 87] The parameter is incorrect
meltano | ERROR Exception in callback _ProactorReadPipeTransport._loop_reading()
handle: <Handle _ProactorReadPipeTransport._loop_reading() created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py:187>
source_traceback: Object created at (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 316, in run_forever
super().run_forever()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 570, in run_forever
self._run_once()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1851, in _run_once
handle._run()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\events.py", line 81, in _run
self._context.run(self._callback, *self._args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 201, in _read_from_fd
read_transport, _ = await loop.connect_read_pipe(
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1525, in connect_read_pipe
transport = self._make_read_pipe_transport(pipe, protocol, waiter)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 664, in _make_read_pipe_transport
return _ProactorReadPipeTransport(self, sock, protocol, waiter, extra)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 187, in __init__
self._loop.call_soon(self._loop_reading)
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 299, in _loop_reading
self._read_fut = self._loop._proactor.recv(self._sock, 32768)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 445, in recv
self._register_with_iocp(conn)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 718, in _register_with_iocp
_overlapped.CreateIoCompletionPort(obj.fileno(), self._iocp, 0, 0)
OSError: [WinError 87] The parameter is incorrect
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\events.py", line 81, in _run
self._context.run(self._callback, *self._args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 309, in _loop_reading
self._fatal_error(exc, 'Fatal read error on pipe transport')
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 131, in _fatal_error
self._force_close(exc)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\proactor_events.py", line 134, in _force_close
if self._empty_waiter is not None and not self._empty_waiter.done():
AttributeError: '_ProactorReadPipeTransport' object has no attribute '_empty_waiter'
meltano | DEBUG Read pipe 4 connected: (<_ProactorReadPipeTransport fd=4>, <asyncio.streams.StreamReaderProtocol object at 0x000001F972CF7B20>)
meltano | DEBUG Read pipe 6 connected: (<_ProactorReadPipeTransport fd=6>, <asyncio.streams.StreamReaderProtocol object at 0x000001F972CF7CD0>)
Aborted!
Exception ignored in: <coroutine object _run_job at 0x000001F972C520C0>
RuntimeError: coroutine ignored GeneratorExit
meltano | ERROR Task was destroyed but it is pending!
source_traceback: Object created at (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "c:\users\derek\appdata\local\programs\python\python38\lib\runpy.py", line 87, in _run_code
exec(code, run_globals)
File "C:\Users\derek\AppData\Local\Programs\Python\Python38\Scripts\meltano.exe\__main__.py", line 7, in <module>
sys.exit(main())
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\__init__.py", line 43, in main
cli(obj={"project": None})
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 782, in main
rv = self.invoke(ctx)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 610, in invoke
return callback(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\params.py", line 23, in decorate
return func(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\params.py", line 57, in decorate
func(project, *args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 137, in elt
run_async(_run_job(project, job, session, context_builder, force=force))
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\utils\__init__.py", line 37, in run_async
future = asyncio.ensure_future(coro)
task: <Task pending name='Task-1' coro=<_run_job() running at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py:214> wait_for=<Task pending name='Task-4' coro=<Out._read_from_fd() running at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:205> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001F972CF7CA0>()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py:422> cb=[<TaskWakeupMethWrapper object at 0x000001F972E2D9A0>()] created at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:148> created at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\utils\__init__.py:37>
--- Logging error ---
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\logging\__init__.py", line 1084, in emit
stream.write(msg + self.terminator)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 65, in write
self.__out.writeline(line)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 174, in writeline
click.echo(self.prefix + line, nl=False, file=self)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\utils.py", line 272, in echo
file.write(message)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\_compat.py", line 710, in _safe_write
return _write(s)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\colorama\ansitowin32.py", line 41, in write
self.__convertor.write(text)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\colorama\ansitowin32.py", line 164, in write
self.wrapped.write(text)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 179, in write
self.file.write(remove_ansi_escape_sequences(data))
ValueError: I/O operation on closed file.
Call stack:
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1733, in call_exception_handler
self.default_exception_handler(context)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1707, in default_exception_handler
logger.error('\n'.join(log_lines), exc_info=exc_info)
Message: 'Task was destroyed but it is pending!\nsource_traceback: Object created at (most recent call last):\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\runpy.py", line 194, in _run_module_as_main\n return _run_code(code, main_globals, None,\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\runpy.py", line 87, in _run_code\n exec(code, run_globals)\n File "C:\\Users\\derek\\AppData\\Local\\Programs\\Python\\Python38\\Scripts\\meltano.exe\\__main__.py", line 7, in <module>\n sys.exit(main())\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\__init__.py", line 43, in main\n cli(obj={"project": None})\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 829, in __call__\n return self.main(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 782, in main\n rv = self.invoke(ctx)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 1259, in invoke\n return _process_result(sub_ctx.command.invoke(sub_ctx))\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 1066, in invoke\n return ctx.invoke(self.callback, **ctx.params)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 610, in invoke\n return callback(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\params.py", line 23, in decorate\n return func(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\params.py", line 57, in decorate\n func(project, *args, **kwargs)\n File 
"c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 137, in elt\n run_async(_run_job(project, job, session, context_builder, force=force))\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\utils\\__init__.py", line 37, in run_async\n future = asyncio.ensure_future(coro)\ntask: <Task pending name=\'Task-1\' coro=<_run_job() running at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py:214> wait_for=<Task pending name=\'Task-4\' coro=<Out._read_from_fd() running at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py:205> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001F972CF7CA0>()] created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py:422> cb=[<TaskWakeupMethWrapper object at 0x000001F972E2D9A0>()] created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py:148> created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\utils\\__init__.py:37>'
Arguments: ()
Exception ignored in: <generator object Job._handling_sigterm at 0x000001F972CD6AC0>
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\job\job.py", line 229, in _handling_sigterm
signal.signal(signal.SIGTERM, original_termination_handler)
File "c:\users\derek\appdata\local\programs\python\python38\lib\signal.py", line 47, in signal
handler = _signal.signal(_enum_to_int(signalnum), _enum_to_int(handler))
TypeError: signal handler must be signal.SIG_IGN, signal.SIG_DFL, or a callable object
meltano | ERROR Task was destroyed but it is pending!
source_traceback: Object created at (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "c:\users\derek\appdata\local\programs\python\python38\lib\runpy.py", line 87, in _run_code
exec(code, run_globals)
File "C:\Users\derek\AppData\Local\Programs\Python\Python38\Scripts\meltano.exe\__main__.py", line 7, in <module>
sys.exit(main())
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\__init__.py", line 43, in main
cli(obj={"project": None})
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 782, in main
rv = self.invoke(ctx)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 610, in invoke
return callback(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\params.py", line 23, in decorate
return func(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\params.py", line 57, in decorate
func(project, *args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 137, in elt
run_async(_run_job(project, job, session, context_builder, force=force))
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\utils\__init__.py", line 39, in run_async
loop.run_until_complete(future)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 603, in run_until_complete
self.run_forever()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 316, in run_forever
super().run_forever()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 570, in run_forever
self._run_once()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1851, in _run_once
handle._run()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\events.py", line 81, in _run
self._context.run(self._callback, *self._args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 214, in _run_job
await _run_elt(project, context_builder, output_logger)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 232, in _run_elt
async with _redirect_output(output_logger):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 223, in _redirect_output
async with meltano_stdout.redirect_stdout(), meltano_stderr.redirect_stderr():
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 161, in redirect_stdout
async with self.writer() as stdout:
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 148, in writer
reader = asyncio.ensure_future(self._read_from_fd(read_fd))
task: <Task pending name='Task-3' coro=<Out._read_from_fd() running at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:205> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001F972CF7AF0>()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py:422> created at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:148>
--- Logging error ---
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\logging\__init__.py", line 1084, in emit
stream.write(msg + self.terminator)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 65, in write
self.__out.writeline(line)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 174, in writeline
click.echo(self.prefix + line, nl=False, file=self)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\utils.py", line 272, in echo
file.write(message)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\_compat.py", line 710, in _safe_write
return _write(s)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\colorama\ansitowin32.py", line 41, in write
self.__convertor.write(text)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\colorama\ansitowin32.py", line 164, in write
self.wrapped.write(text)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 179, in write
self.file.write(remove_ansi_escape_sequences(data))
ValueError: I/O operation on closed file.
Call stack:
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1733, in call_exception_handler
self.default_exception_handler(context)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1707, in default_exception_handler
logger.error('\n'.join(log_lines), exc_info=exc_info)
Message: 'Task was destroyed but it is pending!\nsource_traceback: Object created at (most recent call last):\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\runpy.py", line 194, in _run_module_as_main\n return _run_code(code, main_globals, None,\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\runpy.py", line 87, in _run_code\n exec(code, run_globals)\n File "C:\\Users\\derek\\AppData\\Local\\Programs\\Python\\Python38\\Scripts\\meltano.exe\\__main__.py", line 7, in <module>\n sys.exit(main())\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\__init__.py", line 43, in main\n cli(obj={"project": None})\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 829, in __call__\n return self.main(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 782, in main\n rv = self.invoke(ctx)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 1259, in invoke\n return _process_result(sub_ctx.command.invoke(sub_ctx))\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 1066, in invoke\n return ctx.invoke(self.callback, **ctx.params)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 610, in invoke\n return callback(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\params.py", line 23, in decorate\n return func(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\params.py", line 57, in decorate\n func(project, *args, **kwargs)\n File 
"c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 137, in elt\n run_async(_run_job(project, job, session, context_builder, force=force))\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\utils\\__init__.py", line 39, in run_async\n loop.run_until_complete(future)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py", line 603, in run_until_complete\n self.run_forever()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\windows_events.py", line 316, in run_forever\n super().run_forever()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py", line 570, in run_forever\n self._run_once()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py", line 1851, in _run_once\n handle._run()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\events.py", line 81, in _run\n self._context.run(self._callback, *self._args)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 214, in _run_job\n await _run_elt(project, context_builder, output_logger)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 232, in _run_elt\n async with _redirect_output(output_logger):\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\async_generator\\_util.py", line 34, in __aenter__\n return await self._agen.asend(None)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 223, in _redirect_output\n async with meltano_stdout.redirect_stdout(), meltano_stderr.redirect_stderr():\n File 
"c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\async_generator\\_util.py", line 34, in __aenter__\n return await self._agen.asend(None)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py", line 161, in redirect_stdout\n async with self.writer() as stdout:\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\async_generator\\_util.py", line 34, in __aenter__\n return await self._agen.asend(None)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py", line 148, in writer\n reader = asyncio.ensure_future(self._read_from_fd(read_fd))\ntask: <Task pending name=\'Task-3\' coro=<Out._read_from_fd() running at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py:205> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001F972CF7AF0>()] created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py:422> created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py:148>'
Arguments: ()
meltano | ERROR Task was destroyed but it is pending!
source_traceback: Object created at (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "c:\users\derek\appdata\local\programs\python\python38\lib\runpy.py", line 87, in _run_code
exec(code, run_globals)
File "C:\Users\derek\AppData\Local\Programs\Python\Python38\Scripts\meltano.exe\__main__.py", line 7, in <module>
sys.exit(main())
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\__init__.py", line 43, in main
cli(obj={"project": None})
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 782, in main
rv = self.invoke(ctx)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\core.py", line 610, in invoke
return callback(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\params.py", line 23, in decorate
return func(*args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\params.py", line 57, in decorate
func(project, *args, **kwargs)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 137, in elt
run_async(_run_job(project, job, session, context_builder, force=force))
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\utils\__init__.py", line 39, in run_async
loop.run_until_complete(future)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 603, in run_until_complete
self.run_forever()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\windows_events.py", line 316, in run_forever
super().run_forever()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 570, in run_forever
self._run_once()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1851, in _run_once
handle._run()
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\events.py", line 81, in _run
self._context.run(self._callback, *self._args)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 214, in _run_job
await _run_elt(project, context_builder, output_logger)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 232, in _run_elt
async with _redirect_output(output_logger):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\cli\elt.py", line 223, in _redirect_output
async with meltano_stdout.redirect_stdout(), meltano_stderr.redirect_stderr():
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 167, in redirect_stderr
async with self.writer() as stderr:
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 148, in writer
reader = asyncio.ensure_future(self._read_from_fd(read_fd))
task: <Task pending name='Task-4' coro=<Out._read_from_fd() running at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:205> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001F972CF7CA0>()] created at c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py:422> cb=[<TaskWakeupMethWrapper object at 0x000001F972E2D9A0>()] created at c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py:148>
--- Logging error ---
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\logging\__init__.py", line 1084, in emit
stream.write(msg + self.terminator)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 65, in write
self.__out.writeline(line)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 174, in writeline
click.echo(self.prefix + line, nl=False, file=self)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\utils.py", line 272, in echo
file.write(message)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\click\_compat.py", line 710, in _safe_write
return _write(s)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\colorama\ansitowin32.py", line 41, in write
self.__convertor.write(text)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\colorama\ansitowin32.py", line 164, in write
self.wrapped.write(text)
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\meltano\core\logging\output_logger.py", line 179, in write
self.file.write(remove_ansi_escape_sequences(data))
ValueError: I/O operation on closed file.
Call stack:
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1733, in call_exception_handler
self.default_exception_handler(context)
File "c:\users\derek\appdata\local\programs\python\python38\lib\asyncio\base_events.py", line 1707, in default_exception_handler
logger.error('\n'.join(log_lines), exc_info=exc_info)
Message: 'Task was destroyed but it is pending!\nsource_traceback: Object created at (most recent call last):\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\runpy.py", line 194, in _run_module_as_main\n return _run_code(code, main_globals, None,\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\runpy.py", line 87, in _run_code\n exec(code, run_globals)\n File "C:\\Users\\derek\\AppData\\Local\\Programs\\Python\\Python38\\Scripts\\meltano.exe\\__main__.py", line 7, in <module>\n sys.exit(main())\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\__init__.py", line 43, in main\n cli(obj={"project": None})\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 829, in __call__\n return self.main(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 782, in main\n rv = self.invoke(ctx)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 1259, in invoke\n return _process_result(sub_ctx.command.invoke(sub_ctx))\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 1066, in invoke\n return ctx.invoke(self.callback, **ctx.params)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\click\\core.py", line 610, in invoke\n return callback(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\params.py", line 23, in decorate\n return func(*args, **kwargs)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\params.py", line 57, in decorate\n func(project, *args, **kwargs)\n File 
"c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 137, in elt\n run_async(_run_job(project, job, session, context_builder, force=force))\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\utils\\__init__.py", line 39, in run_async\n loop.run_until_complete(future)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py", line 603, in run_until_complete\n self.run_forever()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\windows_events.py", line 316, in run_forever\n super().run_forever()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py", line 570, in run_forever\n self._run_once()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py", line 1851, in _run_once\n handle._run()\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\events.py", line 81, in _run\n self._context.run(self._callback, *self._args)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 214, in _run_job\n await _run_elt(project, context_builder, output_logger)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 232, in _run_elt\n async with _redirect_output(output_logger):\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\async_generator\\_util.py", line 34, in __aenter__\n return await self._agen.asend(None)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\cli\\elt.py", line 223, in _redirect_output\n async with meltano_stdout.redirect_stdout(), meltano_stderr.redirect_stderr():\n File 
"c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\async_generator\\_util.py", line 34, in __aenter__\n return await self._agen.asend(None)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py", line 167, in redirect_stderr\n async with self.writer() as stderr:\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\async_generator\\_util.py", line 34, in __aenter__\n return await self._agen.asend(None)\n File "c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py", line 148, in writer\n reader = asyncio.ensure_future(self._read_from_fd(read_fd))\ntask: <Task pending name=\'Task-4\' coro=<Out._read_from_fd() running at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py:205> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x000001F972CF7CA0>()] created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\asyncio\\base_events.py:422> cb=[<TaskWakeupMethWrapper object at 0x000001F972E2D9A0>()] created at c:\\users\\derek\\appdata\\local\\programs\\python\\python38\\lib\\site-packages\\meltano\\core\\logging\\output_logger.py:148>'
Arguments: ()
Exception ignored in: <coroutine object _AsyncGeneratorContextManager.__aexit__ at 0x000001F972D5CF40>
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 84, in __aexit__
raise
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 14, in __aexit__
await self._aiter.aclose()
RuntimeError: aclose(): asynchronous generator is already running
Exception ignored in: <coroutine object _AsyncGeneratorContextManager.__aexit__ at 0x000001F972DCA140>
Traceback (most recent call last):
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 84, in __aexit__
raise
File "c:\users\derek\appdata\local\programs\python\python38\lib\site-packages\async_generator\_util.py", line 14, in __aexit__
await self._aiter.aclose()
RuntimeError: aclose(): asynchronous generator is already running
```
| 4 |
7,603,319 | 86,416,389 |
2021-05-04 00:07:00.991
|
Malformed schema_mapping config passed to target-snowflake
|
<!---
Please read this!
Before opening a new issue, make sure to search for keywords in the issues
filtered by the "regression" or "bug" label and verify the issue you're about to submit isn't a duplicate.
If you are submitting an issue with a tap, please include:
- account details
- target details
- entities selected with meltano select (if you have selected any entities), as the bug may be related to a specific entity
- the full elt command you are running
- full output of the meltano elt command. Logs can get pretty long, so you can add the full log as a snippet in the Meltano project and add a link in the issue.
--->
### What is the current *bug* behavior?
_What is happening now?_
I'm trying to configure `schema_mapping` for the transferwise snowflake target.
```yaml
loaders:
- name: target-snowflake
variant: transferwise
pip_url: pipelinewise-target-snowflake==1.12.0
config:
account: my-account
dbname: MY_DATABASE
user: MY_USER
warehouse: MY_WAREHOUSE
file_format: MY_SCHEMA.CSV
role: MY_ROLE
schema_mapping:
public:
target_schema: RAW_PUBLIC
```
This results in this error:
```
target-snowflake | Traceback (most recent call last):
target-snowflake | File "/Users/dean/dev/hi-meltano/.meltano/loaders/target-snowflake/venv/bin/target-snowflake", line 8, in <module>
target-snowflake | sys.exit(main())
target-snowflake | File "/Users/dean/dev/hi-meltano/.meltano/loaders/target-snowflake/venv/lib/python3.8/site-packages/target_snowflake/__init__.py", line 438, in main
target-snowflake | table_cache, file_format_type = get_snowflake_statics(config)
target-snowflake | File "/Users/dean/dev/hi-meltano/.meltano/loaders/target-snowflake/venv/lib/python3.8/site-packages/target_snowflake/__init__.py", line 79, in get_snowflake_statics
target-snowflake | table_schemas=stream_utils.get_schema_names_from_config(config))
target-snowflake | File "/Users/dean/dev/hi-meltano/.meltano/loaders/target-snowflake/venv/lib/python3.8/site-packages/target_snowflake/stream_utils.py", line 43, in get_schema_names_from_config
target-snowflake | schema_names.append(target.get('target_schema'))
target-snowflake | AttributeError: 'str' object has no attribute 'get'
```
The original yaml config gets converted to json and passed to the snowflake target as a file: https://github.com/transferwise/pipelinewise-target-snowflake/blob/master/target_snowflake/__init__.py#L429
If I edit the plugin and log the contents of this file, I get:
```json
{
"account": "my-account",
"dbname": "MY_DATABASE",
"user": "MY_USER",
"password": "hunter2",
"warehouse": "MY_WAREHOUSE",
"file_format": "MY_SCHEMA.CSV",
"default_target_schema": "TAP_POSTGRES",
"batch_size_rows": 100000,
"flush_all_streams": false,
"parallelism": 0,
"parallelism_max": 16,
"schema_mapping": {
"public.target_schema": "RAW_PUBLIC",
"public": {
"target_schema": "RAW_PUBLIC"
}
},
"disable_table_cache": false,
"add_metadata_columns": false,
"hard_delete": false,
"data_flattening_max_level": 0,
"primary_key_required": true,
"validate_records": false,
"no_compression": false,
"role": "MY_ROLE"
}
```
### What is the expected *correct* behavior?
_What should be happening?_
`schema_mapping` in the json should not include `"public.target_schema": "RAW_PUBLIC",`; it should simply be
```
"schema_mapping": {
"public": {
"target_schema": "RAW_PUBLIC"
}
},
```
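For illustration, the doubled entry looks like what you would get if a dot-notation flattening of the nested `schema_mapping` keys were merged back with the nested form. This is a minimal, hypothetical sketch of that failure mode (not Meltano's actual config code) that reproduces the malformed payload:

```python
def flatten(d, prefix=""):
    """Flatten nested dict keys into dot-separated keys."""
    flat = {}
    for key, value in d.items():
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{prefix}{key}."))
        else:
            flat[f"{prefix}{key}"] = value
    return flat

nested = {"public": {"target_schema": "RAW_PUBLIC"}}

# Merging the flattened view with the original nested dict yields
# exactly the malformed "schema_mapping" seen in the generated JSON:
malformed = {**flatten(nested), **nested}
# {'public.target_schema': 'RAW_PUBLIC', 'public': {'target_schema': 'RAW_PUBLIC'}}
```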
### Steps to reproduce
_How one can reproduce the issue?_
python 3.8.9
meltano 1.73.0
Full `meltano.yml`:
```yaml
version: 1
send_anonymous_usage_stats: false
plugins:
extractors:
- name: tap-postgres
variant: transferwise
pip_url: pipelinewise-tap-postgres==1.7.1
config:
user: postgres
dbname: my-db
filter_schemas: public
default_replication_method: LOG_BASED
logical_poll_total_seconds: 5
select_filter:
- public-my_table
metadata:
public-my_table.*:
replication-method: FULL_TABLE
schema:
public-my_table:
export_file_exchange_id:
type:
- string
- 'null'
format: date-time
loaders:
- name: target-snowflake
variant: transferwise
pip_url: pipelinewise-target-snowflake==1.12.0
config:
account: my-account
dbname: MY_DATABASE
user: MY_USER
warehouse: MY_WAREHOUSE
file_format: MY_SCHEMA.CSV
role: MY_ROLE
schema_mapping:
public:
target_schema: RAW_PUBLIC
```
```
$ meltano install
$ meltano elt tap-postgres target-snowflake
```
### Relevant logs and/or screenshots
_Please use code blocks (\`\`\`) to format console output_
### Possible fixes
_If you can, link to the line of code that might be responsible for the problem or suggest a fix_
### Further regression test
_Ensure we automatically catch similar issues in the future_
- [ ] Write additional adequate test cases and submit test results
- [ ] Test results should be reviewed by a person from the team
| 8 |
7,603,319 | 85,822,082 |
2021-04-23 13:21:29.608
|
Follow-up from "building a custom extractor" tutorial docs
|
Logging follow-ups for further improvement on the tutorial page:
The following discussions from !2107 should be addressed:
- [ ] @DouweM started a [discussion](https://gitlab.com/meltano/meltano/-/merge_requests/2107#note_558219557): (+1 comment)
> Can we refer to the documentation on where this script comes from?
- [ ] @DouweM started a [discussion](https://gitlab.com/meltano/meltano/-/merge_requests/2107#note_558219582):
> Like above, I'd rather recommend they use `meltano config <plugin> set password <value>` which will automatically store it in the most appropriate location, which is `.env` in this case.
>
> We don't need to explicitly talk about environment variables then, just like we don't in https://meltano.com/docs/getting-started.html#configure-the-extractor. They can learn more about those if they dive deeper into https://meltano.com/docs/configuration.html and find https://meltano.com/docs/configuration.html#configuring-settings.
- [ ] @DouweM started a [discussion](https://gitlab.com/meltano/meltano/-/merge_requests/2107#note_558219583):
> I think we can remove this section entirely, or in favor of a link to https://meltano.com/docs/configuration.html#configuration-layers.
>
> In general, I think this guide should not focus too much on the different ways/places config can be stored; that's something for the Configuration guide.
- [ ] @DouweM started a [discussion](https://gitlab.com/meltano/meltano/-/merge_requests/2107#note_558219587):
> Instead of just mentioning a command here in a vacuum, I think we should link to the canonical guide at https://meltano.com/docs/getting-started.html#select-entities-and-attributes-to-extract.
>
> I'm starting to think that this guide covers a lot more than it needs to; basically all of https://meltano.com/docs/getting-started.html still applies, except for some bits under https://meltano.com/docs/getting-started.html#add-an-extractor-to-pull-data-from-a-source. I think it'd be better to only cover on what's custom-extractor-specific, and link to the other guides for everything else.
- [ ] @DouweM started a [discussion](https://gitlab.com/meltano/meltano/-/merge_requests/2107#note_558219590): (+1 comment)
> I think this would make more sense as a dedicated doc in the SDK repo that we could link to from here.
| 2 |
7,603,319 | 85,584,822 |
2021-04-20 17:29:18.138
|
Add `--help `docs for meltano CLI
|
Several of the CLI commands are not documented in `meltano --help`. Adding descriptions here could help streamline the new user experience.

Also, many of the descriptions in the screenshot above trail off with '...'.
CLI documentation for reference: https://meltano.com/docs/command-line-interface.html
Reported in slack: https://meltano.slack.com/archives/C01TCRBBJD7/p1618932986365900?thread_ts=1618925301.362900&cid=C01TCRBBJD7
## Update 2021-11-30
This is much better now but still has some missing hints.
```console
$ meltano --help
Usage: meltano [OPTIONS] COMMAND [ARGS]...
Get help at https://www.meltano.com/docs/command-line-interface.html
Options:
--log-level [debug|info|warning|error|critical]
-v, --verbose
--environment TEXT Meltano environment name
--version Show the version and exit.
--help Show this message and exit.
Commands:
add Add a plugin to your project.
config Display Meltano or plugin configuration.
discover
elt meltano elt EXTRACTOR_NAME LOADER_NAME extractor_name: Which...
environment Manage Environments.
init Creates a new Meltano project
install Installs all the dependencies of your project based on the...
invoke Invoke the plugin's executable with specified arguments.
remove Remove a plugin from your project.
schedule Manage Schedules.
schema
select Execute the meltano select command.
ui
upgrade
user
```
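The blank and truncated entries match how click (which Meltano's CLI is built on, as the tracebacks elsewhere in this project show) derives the one-line hint in a command listing: from the command's docstring, or an explicit `short_help`. Commands with neither show up blank, and long docstring first lines get truncated with '...'. A hedged sketch — command names and help strings here are illustrative, not Meltano's actual code:

```python
import click

@click.group()
def cli():
    """Get help at https://www.meltano.com/docs/command-line-interface.html"""

@cli.command()  # no docstring or short_help: listed with an empty description
def schema():
    pass

@cli.command(short_help="Explore available plugins.")
def discover():
    """Explore the available plugins.

    Without an explicit short_help, click truncates a long first line of
    the docstring with '...' in the command listing, which is the
    trailing-off effect seen in the screenshot.
    """
```

So filling in the missing hints amounts to adding docstrings (or `short_help`) to the undocumented command functions.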
| 2 |
7,603,319 | 85,329,762 |
2021-04-16 17:05:33.133
|
Office Hours on Meltano website should link to #office-hours
|
https://meltano.com/docs/community.html#office-hours
:point_up: This should also reference our slack link for #office-hours: https://meltano.slack.com/archives/C01QS0RV78D
| 1 |
7,603,319 | 85,056,180 |
2021-04-12 15:55:29.044
|
Fail when plugin `pip_url` changed and installation is outdated (pip_url drift)
|
### Problem to solve
I installed Meltano with tap-mysql and target-postgres. After creating some pipelines I realised that for a data set I needed to track deletions, which would require the transferwise variant. I switched the variant in meltano.yml, but then spent a while trying to work out why it was not creating `_sdc_deleted_at` columns in Postgres.
This is partly because I'm new to Meltano; the solution was to reinstall target-postgres with the transferwise variant, after which everything just worked, as you'd expect!
### Proposal
It would really help (newer) users if Meltano validated the installed variant against the configuration when running plugins.
### What does success look like, and how can we measure that?
If the installed variant of a tap or a target does not match the variant in the meltano.yml then an error or warning should be shown.
### Proposed spec (added 2022-03-03)
1. During install we drop a marker file into the root of the venv containing the text of the `pip_url` used during installation. (Or a hash, if we think the pip_url is sensitive.)
- E.g. `.meltano/.../dbt/meltano_pip_url_marker.txt`
2. During execution we check the marker file against the current `pip_url` value.
3. If the comparison fails, we either warn or throw a hard failure before executing the plugin.
| 4 |
7,603,319 | 82,213,690 |
2021-04-02 22:24:10.454
|
update default dbt version to 0.18.2
|
This issue is being created to document an MR already in progress.
At this moment, meltano uses `dbt==0.16.1` for the dbt transformer. There appear to be some installation issues with the pinned version of dbt, related to its `agate` and `PyICU` dependencies.
dbt recently released https://github.com/fishtown-analytics/dbt/releases/tag/v0.18.2, which pins `agate<1.6.2` to avoid installation errors relating to its new dependency `PyICU`.
Testing confirmed that `dbt==0.18.2` installs successfully from Meltano:
```
transformers:
- name: dbt
# pip_url: dbt==0.16.1
pip_url: dbt==0.18.2
```
Note there are significant differences in parsing dbt_project files between versions. Migration guide here: https://docs.getdbt.com/docs/guides/migration-guide/upgrading-to-0-17-0
| 12 |
7,603,319 | 81,869,676 |
2021-03-29 19:00:20.927
|
Include discovery output in `meltano elt` output
|
### What is the current *bug* behavior?
When running `meltano --log-level=debug elt tap-spreadsheets-anywhere target-postgres --job_id=test`, with the tap trying to load an unsupported JSON file, error messages from the tap's discover code are not shown in the output.
After checking with @DouweM, it is apparently related to the fact that meltano elt runs the tap twice, once with --discover to generate the catalog, and then in sync mode along with the target. meltano elt only forwards stdout and stderr from that sync mode run, not the discovery run, meaning that any log messages that only show up during discovery mode are lost, unless discovery mode fails entirely and all output is dumped.
### What is the expected *correct* behavior?
Log messages from the discovery step should be forwarded to the output as well.
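One shape the fix could take is to forward the discovery run's stderr unconditionally, instead of only when discovery fails outright. A minimal, hypothetical sketch (not Meltano's actual implementation):

```python
import subprocess
import sys

def run_discovery(tap_command: list) -> str:
    """Run a Singer tap in discovery mode and return the catalog JSON.

    Singer taps write log messages to stderr; forwarding stderr even on
    success means discovery-time warnings are no longer lost.
    """
    result = subprocess.run(
        [*tap_command, "--discover"],
        capture_output=True,
        text=True,
    )
    sys.stderr.write(result.stderr)  # forward tap logs unconditionally
    result.check_returncode()        # still fail hard on discovery errors
    return result.stdout
```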
### Steps to reproduce
- setup `tap-spreadsheets-anywhere` to load a file incorrectly formatted, so that discovery will fail with an error message. The following config for the tap can be used to trigger such an error:
```yaml
plugins:
extractors:
- name: tap-spreadsheets-anywhere
pip_url: git+https://github.com/ets/tap-spreadsheets-anywhere.git
config:
tables:
- path: https://salsa.debian.org/iso-codes-team/iso-codes/-/raw/main/data/
format: json
name: iso_countries
pattern: "iso_3166-1.*"
key_properties:
- 'alpha_2'
start_date: '2011-01-01T00:00:00Z'
```
- run `meltano --log-level=debug elt tap-spreadsheets-anywhere target-postgres --job_id=test`
- note that the output does not include the expected error message (`ERROR Unable to write Catalog entry for 'iso_countries' - it will be skipped due to error 'str' object has no attribute 'items'`), which should appear in the logs
### Relevant logs and/or screenshots
### Possible fixes
### Further regression test
_Ensure we automatically catch similar issues in the future_
- [ ] Write additional adequate test cases and submit test results
- [ ] Test results should be reviewed by a person from the team
| 8 |