Drupal installation profiles are usually just starting points for a new site, but sometimes you want to leave an installation profile untouched, adding your own additions in a separate space and leaving a “clean” upgrade path for future versions of the profile. Here’s a handy script I use for such situations.
Let’s say you’re building a site on top of an installation profile such as Drupal Commons, and you want to treat Commons the way you treat Drupal Core (that is, leave it untouched). My approach is to create a custom module that gets enabled after Commons is installed; that module carries all of the customizations unique to this site.
Anyone who works with installation profiles recognizes the need to quickly wipe your database and start the installation process over again, to make sure that everything installs cleanly. However, in this scenario, there’s no need to keep re-installing the profile, since it’s not changing.
The following script will check for a database snapshot of the installed profile (in this case, “commons”), and if it’s absent, it will use drush to install the site and save a backup of the database. Then it will enable my custom module. On subsequent runs, since it will find the database snapshot, it won’t re-install the profile; instead it will simply drop the database, import the snapshot, and then enable the custom module.
#!/bin/bash

cd drupal

if [[ ! -f ../snapshot.sql ]]; then
  echo "Installing profile."
  drush site-install -y -v 'commons' \
    --site-name='My Drupal Site' \
    --account-mail=firstname.lastname@example.org \
    --account-name=admin \
    --account-pass=admin \
    --site-mail=email@example.com \
    install_configure_form.site_default_country=US

  echo "Saving database snapshot."
  drush sql-dump -v --result-file=../snapshot.sql
else
  echo "Dropping database."
  drush sql-drop -y -v

  echo "Importing database snapshot."
  drush sql-cli < ../snapshot.sql
fi

echo "Enabling custom module."
drush en -y -v custom_module

cd ..
An idea to further refine this: after the entire process is finished, import the database snapshot into a temporary database. Then, the next time the script is run, drop the old tables and rename the temp tables so they land in the actual database. That might make the rebuild a little faster, since the import happens ahead of time rather than while you wait.
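A rough sketch of that table-swap idea, assuming a MySQL backend (where RENAME TABLE can move a table between databases on the same server). The database names `drupal` and `drupal_tmp`, the hard-coded table list, and the `swap_sql` helper are all hypothetical placeholders, not part of the script above:

```shell
#!/bin/bash
# Sketch only: "drupal" and "drupal_tmp" are placeholder database names,
# and the table list is hard-coded where a real script would query
#   mysql -N -e "SHOW TABLES IN drupal_tmp"
# swap_sql prints the SQL that replaces each live table with its
# pre-loaded copy; MySQL's RENAME TABLE can move a table across
# databases on the same server.
swap_sql() {
  local db="$1" tmpdb="$2"
  shift 2
  for t in "$@"; do
    echo "DROP TABLE IF EXISTS $db.$t;"
    echo "RENAME TABLE $tmpdb.$t TO $db.$t;"
  done
}

# The generated statements would be piped into mysql or drush sql-cli.
swap_sql drupal drupal_tmp node users variable
```

On a real rebuild you would restore snapshot.sql into the temp database after each run finishes, then pipe the generated statements into drush sql-cli at the start of the next run instead of re-importing the whole dump.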