At present, removing a table from the publication causes us to treat that table as wholly unreliable. To re-add the table to our stream, we schedule an import of all existing data to ensure we haven't missed anything.

Users should be able to remove tables without much worry, especially when one table has changes that are incompatible with the sink, so imports should be as cheap as possible. If we tracked the LSN at which we ceased streaming changes for the table, we could filter which rows are included in a subsequent import by inspecting the xmin system column on each row.
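As a rough illustration of the idea (not a concrete proposal; the table and column names are hypothetical, and the exact mapping from a recorded LSN to a transaction ID is hand-waved here), the filtered re-import might look something like:

```sql
-- Hypothetical sketch only. Assumes that when the table was removed from
-- the publication we recorded a transaction-id cutoff corresponding to the
-- LSN at which streaming stopped, e.g. via:
--
--   SELECT pg_current_xact_id();  -- stored as removed_at_xid
--
-- On re-add, instead of importing every row, import only rows whose xmin
-- indicates they may have been modified after that cutoff:

SELECT *
FROM some_table
WHERE age(xmin) < age('12345'::xid);  -- 12345 = recorded removed_at_xid
```

This would need care in practice: xmin is a 32-bit xid subject to wraparound (hence the age() comparison rather than a raw numeric one), rows frozen by vacuum lose their original xmin, and deletes that occurred during the gap would still be invisible to an xmin filter and would need separate handling.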