Rollback and reapply norms in development? #37

Open
danielcompton opened this issue Jul 11, 2017 · 6 comments

Comments

@danielcompton

Duct's migrator.ragtime has a really neat feature where it hashes each migration; if a migration's hash changes, it rolls back the old version and reapplies the new one. This is really useful in development, where you want to experiment with different schemas, or incrementally modify an existing one.

Would something like this make sense for Conformity?
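For reference, the hash check that ragtime-style migrators perform can be sketched in a few lines. Everything here is a sketch under assumptions: the function names are hypothetical and not part of Conformity's API.

```clojure
(ns norm-hash-sketch
  (:import (java.security MessageDigest)))

(defn norm-hash
  "SHA-256 of a norm's printed form, as a hex string."
  [norm]
  (->> (.digest (MessageDigest/getInstance "SHA-256")
                (.getBytes (pr-str norm) "UTF-8"))
       (map #(format "%02x" %))
       (apply str)))

(defn norm-changed?
  "True when the hash stored at transact time no longer matches
   the current definition of the norm."
  [stored-hash norm]
  (not= stored-hash (norm-hash norm)))
```

Storing `(norm-hash norm)` alongside the conformity marker at transact time would be enough to detect a modified norm on the next run.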

@avescodes
Owner

avescodes commented Jul 11, 2017 via email

@danielcompton
Author

danielcompton commented Jul 17, 2017

I've been thinking about this a little; here are my current thoughts.

Background

In development, it is often desirable to work with a "development" database, and rework its schema multiple times before committing to a particular approach and deploying to production. Often the development database has useful working state in it which is inconvenient or slow to recreate.

Datomic doesn't allow schema to be retracted or excised, so it is not possible to completely roll back a migration the way you could in a traditional SQL database.

Idea

When conforming schemas, Conformity can (optionally) check whether the norms that were transacted still match their current definitions. If there is a difference, the user can choose among several strategies to bring the db back into alignment. This is intended only for use in development; schema migration in production is a separate issue.

Schema strategy:

  • No check
  • Warn - Print a warning to the console (probably best for production)
  • Rename - rename all conformed attributes that have changed to a synthetic name, and then reapply the new attributes
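For context, the rename strategy is expressible with stock Datomic, which lets you rename an attribute by transacting a new :db/ident for it (the attribute and synthetic name below are made up for illustration):

```clojure
;; Rename :user/email to a synthetic name so the norm can be
;; reapplied with a fresh :user/email definition. In Datomic the
;; old ident remains resolvable as an alias of the renamed entity.
@(d/transact conn
   [{:db/id    :user/email
     :db/ident :user.conformity-retired/email-v1}])
```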

Data strategy:

  • Warn - warn that the data is now using renamed attributes
  • Retract - retract any data that was transacted since the modified norm was applied
  • Excise - excise any data that was transacted since the modified norm was applied

I'm not sure what should happen to data that was transacted as part of a norm itself. Maybe treat it the same as data transacted after the norm was applied?

This is a little bit hazy for me, so some of the details above may not make sense or be possible, but this is the general direction I'm thinking of.
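Concretely, the knobs above might surface as an options map on ensure-conforms. Everything in this sketch is hypothetical (the keywords and the extra arity are not part of Conformity's current API):

```clojure
;; Hypothetical extended arity:
;;   :schema-strategy  one of #{:none :warn :rename}
;;   :data-strategy    one of #{:warn :retract :excise}
(conformity/ensure-conforms conn norms-map
  {:schema-strategy :rename
   :data-strategy   :retract})
```

In production you'd leave the defaults (no check, or warn-only); the destructive strategies only make sense against a development database.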

@kennethkalmer

My approach is to use datomock in my REPL and in my tests for the initial work of fleshing out the schema, with a helper function that looks more or less like this:

(defn mock-conn!
  "Fork conn in-memory, run migrations against the fork, and def
   the mocked connection and db for REPL convenience."
  [conn]
  (def mconn (app.db.core/migrate-schema! (datomock/fork-conn conn)))
  (def mdb (d/db mconn)))

I then iterate on a mock connection in the REPL, and the tests give me some really good feedback. This also has the benefit of validating my transactions as I go.

It is not perfect (yet). I want to improve the flow and be able to swap out the real Datomic connection with some development middleware so I can actually test my full app against the mocked/conformed database. For now I'm relying on my tests for this feedback, and I still find myself having to add additional transactions after I've conformed and the tests have passed.

@avescodes
Owner

avescodes commented Jul 26, 2017 via email

@daemianmack
Contributor

In case it's useful as a dissenting data point, I don't personally feel the anticipated benefit of this feature sufficiently offsets the increase in complexity or API surface area.

To me, it feels more like a workflow issue than a tooling one.

In the past, I've always addressed the problem of arriving at a correct norm by iterating on small, focused schema alterations via some mix of REPL testing and Datomic's mem DB before committing final PR-worthy changes to the schema description -- similar in spirit to the approach described by @kennethkalmer.

This hasn't been nearly painful enough that I've wanted extra tooling support for it.

Just wanted to throw that out there in case it helps highlight an alternative path. Happy to discuss further -- in particular, I'd be curious to see a concrete code situation that constitutes a strong pro argument for this change; perhaps I've just been lucky?

@danielcompton
Author

danielcompton commented Jul 27, 2017

One strong use case we have is that our front-end developers run our back-end Datomic application, but don't usually work with the code much. They need a persistent database so that they can keep the state they've built up over time when developing a feature for a few days. When I push a new update which has changed the existing migration (that hasn't made it to prod yet), I'd like it to automatically migrate the data, or at least warn them that their schema isn't up-to-date.

Currently, I just tell them when to drop the entire DB, which works, but isn't the best, and sometimes I forget to tell them, leading to strange errors.
