Notes on the way Drupal Entities and Configuration have been utilized in boston.gov and theHub.
Emergency Alerts Signup form and services.
City of Boston strives to automate the development, test, package and deploy process at each step, from local development through to deployment in live production.
This page is out of date and needs review (as at 17 June 2021)
The repository is cloned into a local folder and ready for building. This entry condition can be achieved in any of these ways:
- If you have not yet built the boston.gov website on your local machine, or
- If you have cloned a new branch or created a new branch that you wish to build, you can run the doit rebuild quick script, or
- If you have the repository cloned, but wish to delete it and rebuild a fresh website from a branch on the GitHub repository, you can run doit rebuild full <branch>. If you don't specify a branch, then develop will be used.
The local developer is responsible for creating the local development environment.
The local build process is defined and controlled by Lando when lando start
is executed.
The doit
scripts serve to prepare the cloned repository prior to running lando start
Lando
lando start
causes the following processes to be run from the .lando.yml landofile:
- 3 standard Linux (Ubuntu) containers are created: one optimized as an appserver with Apache, one optimized as a database server with MySQL, and one with Node.
- Install the required/dependent packages and tools - including Phing and Composer.
- Create and install XDebug and other Apache/PHP settings files.
- Set apache vhosts and container's network configs (done by Docker via Lando).
- Start all 3 containers.
- Launch the phing script setup:docker:drupal-local.
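For orientation, here is a minimal sketch of the kind of landofile this describes. It is not the actual COB .lando.yml (which lives at the repo root); the service definitions and the phing call shown are illustrative only.

```yaml
# Illustrative sketch only - see /[reporoot]/.lando.yml for the real configuration.
name: boston
recipe: drupal8
config:
  webroot: docroot
services:
  node:
    type: node
events:
  post-start:
    # Hand over to Phing once the containers are up (target name from the notes above).
    - appserver: phing -f /app/scripts/phing/tasks/setup.xml setup:docker:drupal-local
```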
Phing
The phing script setup:docker:drupal-local in reporoot/scripts/phing/tasks/setup.xml executes the following:
Download Drupal dependencies into Apache appserver container - including Drush. (done using Composer).
Download confidential settings and copy into Drupal file system (using Git).
Install Drupal by Installing a new Database on the database container. (using Drush).
Install Drupal modules and load configuration files. (using Drush).
Run Drupal's Update process to load updated-settings from modules. (using Drush).
Modify Drupal settings with localized settings.
Reset the admin password and issue login url. (using Drush).
Run Linting Test using PHP Linting. (done by PHP via Phing, launched by Travis).
Run Code Sniffer Test. (done by Squizlabs PHP_CodeSniffer via Phing, launched by Travis).
(coming soon) Run Behat behavioral tests. (done by Behat via Phing, launched by Travis).
(coming soon) Run PHPUnit functional tests. (done by PHPUnit via Phing, launched by Travis).
For local development, the docker container build is controlled by Lando, with Phing being used to build Drupal.
When a Pull Request is created to merge code into the develop branch on GitHub, a test build and some automated testing is run by Travis. Travis is used in place of Lando to initiate and control the build process as described above (i.e. Travis is used to build docker containers on GitHub/Travis infrastructure, whereas Lando builds docker containers on local machines). In both cases the Travis and Lando scripts are very similar in structure and as identical as possible in function. Once the containers are built, both tools use the same Phing scripts to build and initiate Drupal.
(coming soon) Terraform will be used to spin up on-demand test/develop/experiment/demo instances of the containers (i.e. the websites) on AWS infrastructure. In this case Terraform scripts will be used to control the build in place of Lando - but (as with Travis) will be as similar as possible in function. Again, once the containers are built on AWS, the same Phing scripts will be used to build Drupal.
Set up environment for Drupal development on various operating systems.
Select your operating system from below, and follow the instructions to set up your development environment and prepare to install the City of Boston Drupal 8 website.
Tip
You can (re)use an existing key on your development computer, so long as it meets the requirements of GitHub.
How to create SSH keys for github
Be sure you load the public keys you create into GitHub.
Tip
You can (re)use an existing key on your development computer, so long as it meets the requirements of Acquia.
City of Boston recommends the Ubuntu 16.04 or later distribution. While other Linux distributions will operate well, the instructions below assume the use of Ubuntu and, in particular, the apt package manager.
Check Docker pre-requisites.
If using PHPStorm, install Docker-machine
At their core, Mac operating systems are similar to Linux and therefore the same basic steps apply to Macs as they do for Linux.
Git is usually installed, and on most operating systems you can verify this by typing the command below at a terminal prompt. This process has the advantage of prompting you to install git if it's not there.
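On a Mac, the check is typically just asking git for its version; if the Xcode command line tools are missing, macOS offers to install them:

```bash
git --version
```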
Enter the command below. This will install a brew-community version of Lando, including docker as explained here.
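The Homebrew install is typically a one-liner along these lines (the cask name is assumed to be lando; check the current Homebrew cask before running):

```bash
brew install --cask lando
```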
Using brew is quick and simple and will definitely get you started. If you later find that you have issues with Lando and/or Docker versions, then follow the instructions on this page under the title "Install DMG via direct download" to get the latest versions.
Because Drupal is most commonly installed on Linux servers, City of Boston DoIT does not recommend using Windows® as a developer machine due to the increased difficulty in emulating the most common Drupal production web server.
However, if you have no alternative, or harbor an unquenchable desire to use Windows® then the following best practices and instructions should get you headed in the right direction.
There are many IDEs capable of being used to write, verify and deploy PHP code. City of Boston does not endorse any particular platform, but has successfully used the following:
Notepad++ (basic text editor)
Sublime Text (improved text editor)
VIM (Linux-based advanced text editor)
Visual Studio Code (full IDE)
Eclipse (full IDE)
PHPStorm (full IDE)
Run phpcs on your custom modules
PHP CodeSniffer (https://github.com/squizlabs/PHP_CodeSniffer) is already included in our D8 project via Composer. If you run a lando composer install you should have it available at ./vendor/bin/phpcs
1. You need to specifically download the Drupal coding standards using the coder module. You can do this globally for your computer by running:
2. You need to make sure phpcs knows about your newly installed coding standard (note the path below assumes you're using Ubuntu, yours might be different on a mac):
3. Now you can run this manually against your custom modules:
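The three steps above usually look something like the following sketch (the composer global path shown is the typical Ubuntu default and may differ on your machine; the module path is illustrative):

```bash
# 1. Install the Drupal coding standards (the coder package) globally.
composer global require drupal/coder

# 2. Tell phpcs where the new standards live (typical Ubuntu path shown).
phpcs --config-set installed_paths ~/.config/composer/vendor/drupal/coder/coder_sniffer

# 3. Run the sniffer manually against a custom module.
./vendor/bin/phpcs --standard=Drupal,DrupalPractice \
  --extensions=php,module,inc,install,theme \
  docroot/modules/custom/<module_name>
```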
If you're looking for more info, here's a good place to get started: https://www.drupal.org/docs/8/modules/code-review-module/installing-coder-sniffer
Contact the AWS administrator to get credentials for logging into the AWS console and (if necessary) interacting with AWS via the command line.
Once you have a login to the AWS console: if you wish to use the AWS-CLI, or use any other command line program which connects to AWS (e.g. git for CodeCommit) you will need to register/add an SSH key on your AWS-CLI account.
You can use an existing ssh key, or create a new one.
You need to install the AWS CLI if you, or a tool you use, needs to interact with AWS from the command line - for example:
To use terraform to maintain AWS
To deploy webapps to AWS
To modify AWS objects from the command line
Follow the instructions here.
You want to install the AWS-CLI on your local machine, not inside a container. Follow the Mac, Windows or Linux instructions according to the OS you are using.
Verify AWS is installed using LINUX console:
You should see an output something like:
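The check and the expected shape of the output look roughly like this (version numbers will differ):

```bash
aws --version
# aws-cli/2.x.x Python/3.x.x Linux/...
```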
If not then return to the "Install AWS-CLI" section above.
Obtain your secret access keys for AWS from the AWS administrator, and then create the AWS credentials file using the LINUX console:
Alternatively, you could create and edit the ~/.aws/credentials
file using any text editor.
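A sketch of the console approach (the key values shown are placeholders; use the keys supplied by the AWS administrator):

```bash
mkdir -p ~/.aws
cat > ~/.aws/credentials <<'EOF'
[default]
aws_access_key_id = <your-access-key-id>
aws_secret_access_key = <your-secret-access-key>
EOF
```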
Setting up the Visual Studio Code editor to work well with Drupal
Edit .vscode/launch.json
Add the following configuration:
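A typical configuration for the PHP Debug extension against a Lando-based site looks like the following sketch. It assumes Xdebug is listening on port 9000 and that the repo is mounted at /app inside the appserver container; adjust both if your setup differs.

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      // Assumes Xdebug on port 9000 and the Lando mount point of /app.
      "name": "Listen for Xdebug (Lando)",
      "type": "php",
      "request": "launch",
      "port": 9000,
      "pathMappings": {
        "/app": "${workspaceFolder}"
      }
    }
  ]
}
```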
In the top navbar, navigate to File > Preferences > Settings.
Under Workspace Settings, expand the Extensions option.
Locate the PHP CodeSniffer configuration, scroll down to the Standard section and click the "Edit in settings.json" link.
Add the following configuration to your Workspace Settings:
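The exact keys depend on which phpcs extension you installed; for the commonly used "phpcs" extension the workspace settings would look something like:

```json
{
  "phpcs.enable": true,
  "phpcs.executablePath": "./vendor/bin/phpcs",
  "phpcs.standard": "Drupal,DrupalPractice"
}
```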
Under Extensions in the left sidebar, search for "PHP Debug" and click "Install"
Under Extensions in the left sidebar, search for "phpcs" and click "Install"
Three options for setting up a development environment on Windows.
Because Drupal is most commonly installed on Linux servers, City of Boston DoIT does not recommend using Windows® as a developer machine due to the increased difficulty in emulating the most common Drupal production web server. However, if you have no alternative, or harbor an unquenchable desire to use Windows® then the following best practices and instructions should get you headed in the right direction. There are 3 strategies to choose from:
This is the most complicated solution to setup, but allows the developer to use any windows-based tools desired to manage the Drupal codebase and databases.
The git repo is cloned to a local Windows folder on the Windows host. This repo folder is mounted into a Linux (Ubuntu) Docker Container (like a VM). Docker manages the virtualization and the container contains all the apps and resources required to host and manage the website locally for development purposes. Git commands are run either from the Windows host, or from the container. Lando (a container manager tool) provides a “wrapper” whereby commands (e.g. Docker, Lando, Git, Phing, Drush, Composer, SSH etc) are typed into a console on the Windows host, and Lando executes them inside the container. To be clear, with this strategy:
The container hosts the website
The developer normally changes/adds/removes Drupal files in the Windows folder on the Windows host
Changes to custom Drupal files (i.e. to files in the mounted folder) either on the host or in the container are immediately available to both the host and container without restarting docker or VMs
The developer normally runs dev tools such as Git, Drush, Phing and Composer in the container, using Lando commands
The Windows host does not need to have tools other than Docker, Lando and VBox or Hyper-V installed on it
Some developers still like to have git installed on the Windows host so their IDE tools (e.g. PHPStorm) can manipulate the repos directly
Developers’ need to interact directly with the container (i.e. via ssh) is minimized, and
This installation creates a developer environment suitable for a Linux-based production deployment.
Because Lando requires Docker CE (not Docker Toolbox), which in turn requires Hyper-V, you:
- NEED to have a Windows 10 64-bit Professional or Enterprise version
- CANNOT use Windows 7 or earlier
- CANNOT use Windows Home or Home Pro, as Hyper-V is required by Lando and does not ship with Home versions.
These 6 steps are all performed on the host (i.e. your Windows®) PC.
This is required to supply a Linux core which is needed by Docker to generate the necessary containers.
Install Windows Subsystem for Linux (preferred method)
These instructions also depend on having a current version of Windows® 10 (later than the Fall Creators Update, and preferably build 16215 or later).
To install WSL support, do the following:
Open Windows Powershell as Administrator
Run:
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux
Restart Windows when prompted
Taken from here
Install Linux Distro
DoIT suggests you install the Linux distribution from the Microsoft Store which most closely matches the Linux distro you will use on your production webservers. If you are unsure, install Ubuntu or Debian.
Install Hyper-V
If Hyper-V was not enabled when the Linux subsystem was installed (check by typing "Hyper-V" in the start menu), then follow these instructions.
If you are not using WSL, then Git for Windows provides a bash terminal for the Windows host. Installing Git for Windows is a convenient way to get this, and also gives the developer the option to directly execute git commands (against the repo) from the Windows host. This step is optional if you use WSL, or if you are confident with some other tool to provide a bash style console. Use Git for Windows from here. This is a good tutorial to step thru installation.
If you are using WSL and have enabled Hyper-V for your virtualization, then use the Docker “community version” from here - this link also guides you through an install.
Download the latest Windows .exe installer from here.
On Windows®, DoIT recommends:
In order to use VS Code for Drupal development, use this guide as a starting point. The editor is highly configurable with many extensions available. You will likely want to customize it further based on your needs.
Pickup from step 3 on the quick install guide.
This solution may be a quick and viable option if you have a powerful Windows machine to use as the host, and are not doing much development which requires extensive use of an IDE. Depending on your setup, there may be issues with IP address routing, requiring complex configurations.
This method is not used by City of Boston DoIT; the preferred solutions on Windows machines are A or B.
For Windows® versions before the 10 Fall Creators Update, we recommend that VirtualBox (free from Oracle) is used.
For later versions you should enable and use Hyper-V within Windows.
In the VM, install a Linux distro as close as possible to the production distro you will use, and unless you are very comfortable with the Linux CLI, be sure to install a distro with a GUI.
Once the Linux distro is installed, then follow the setup instructions for Linux.
Creates a Drupal 8 container, a MySQL container and a Node container, and connects them all up.
For more detailed install and usage instructions for various platforms, see "More Help" below.
Ensure you have set up your development environment as described here:
On host computer, change directory to the repository root and use lando to create and start containers:
lando start
Depending on the power of the host machine, the Drupal 8 build process for boston.gov can take more than 15-20 minutes. The composer install and site install (esp. config import) tasks can take 5-10 minutes each - with no updates being directed to the console.
You can follow the process by inspecting the log files in docroot/setup/; there are links to these files in the console.
From the repository root (on host):
to view a list of available lando commands: lando
to view phing tasks: lando phing -l
to run drush commands: lando drush <command>
to log in to the docker container as www-data: lando ssh
to ssh in and log in as root: lando ssh -user=root
to open a shell in a specific service: lando ssh <servicename> (servicename = appserver / database / node)
To reduce typing at the console, you can add the following aliases to your ~/.bashrc, ~/.bash_aliases or ~/.bash_profile files on your development (host) OS.
With these aliases, typing (in a console) lls <folder>
will use lando to run ls -la <folder>
in the default container (in our case appserver) and list files there. Whereas, ls <folder>
will list the folder locally (i.e. on the host) as usual.
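A sketch of what those shortcuts might look like (shell functions rather than plain aliases so that arguments pass through to lando; the names are just suggestions):

```bash
# List files inside the appserver container: lls <folder>
lls() { lando ssh -c "ls -la $1"; }

# Shorthand for drush and composer inside the container.
ldrush() { lando drush "$@"; }
lcomposer() { lando composer "$@"; }
```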
For more information on installation, usage and administration of the development area, go to the next section.
Clone the public repository into a local folder
git clone -b <branchname> git@github.com:CityOfBoston/boston.gov-d8.git
(City of Boston DoIT recommends that the develop branch be used)
Using Lando in City of Boston.
For our purposes, Lando is a tool-set which does 3 main things:
Provides a pre-packaged local docker-based development environment,
Provides a wrapper for common docker container management processes,
Provides a means to execute development commands inside the container from the host.
Lando curates an appropriate LAMP stack for Drupal development, largely removing the need for this skill in the local development team. The stack is contained within:
Docker images that are maintained by Lando.
A configuration file (landofile) which Lando parses into the necessary dockerfiles and utility scripts
COB uses a landofile which can be found at /[reporoot]/.lando.yml
Lando provides a CLI for tasks developers commonly need to perform on the container.
A full list of defined Lando commands can be obtained by executing:
lando
lando drupal-pull-repo
lando drupal-sync-db
lando drupal-pull-repo --no-sync && lando drupal-sync-db
lando rebuild
or, to be completely sure, run these commands from the repo's parent folder:
lando destroy && rm -rf <repo-path>
git clone -b develop git@github.com:CityOfBoston/boston.gov-d8.git <repo-path>
lando start
| Command | Explanation |
| --- | --- |
| lando start | Starts all 3 lando containers, building them if they don't already exist. |
| lando stop | Stops all 3 containers, but does not delete or destroy them. They can simply be restarted later. |
| lando rebuild | Will rebuild the container using the values in the .lando.yml and .config.yml files. If the containers have persistent images, these will be reused. Any content in the database will be lost; project files cloned/managed by git will be left intact. |
| lando destroy | Will destroy the container. If the containers have persistent images, these will be retained. Any content in the database will be lost; project files cloned/managed by git will be left intact. |
| Command | Explanation |
| --- | --- |
| lando ssh | Opens a bash terminal on the appserver docker container. If the -c switch is used, lando ssh -c "<command>", then a terminal will be opened, the command provided will be run in the container and then the session will be closed. eg: lando ssh -c "ls -la /app/docroot/modules/custom" |
| lando drush | Executes a drush cli command in the appserver container: lando drush <command>, eg lando drush status. Note: a drush alias can be passed like this: lando drush @alias <command>, eg: lando drush @bostond8.prod en dblog |
| lando drupal | Executes a Drupal cli command in the appserver container: lando drupal <command> |
| lando composer | Executes a Composer command on the appserver container: eg: lando composer require drupal/paragraphs:^1.3 |
| lando drupal-sync-db | Executes a cob script which copies the database from the stage environment to the local development environment, and syncs all the configurations etc. eg: lando drupal-sync-db |
| lando drupal-pull-repo | Executes a cob script which pulls the latest project repository from GitHub and then clones and merges the private repository. Finally it runs sync tasks to update the DB with any new configurations. eg: lando drupal-pull-repo. To update the repos without syncing the content, execute: lando drupal-pull-repo --no-sync |
| lando validate | Locally runs the linting and PHP code sniffing checks that are run by Travis. |
| lando switch-patterns | Allows you to switch between patterns CDN hosts. lando switch-patterns 2 switches to the local CDN in the patterns container; lando switch-patterns 3 switches to the production CDN; lando switch-patterns 4 switches to the stage patterns CDN. |
For some people, working within the lando containers slows down and crashes their environment. To fix this, they can work outside the lando containers (patterns.lndo.site) and directly with localhost:3030.
The local development version of the CDN is hosted by Fleet at http://localhost:3030. This local CDN is served (by Node/Fractal) from your local environment.
If the installation has completed without errors, then you should be able to check the following:
The repo that was checked out in Step 1 of the installation instructions is hosted on your dev computer, and is mounted into each of the docker containers. As you make changes to the files on your dev computer, they are instantly updated in all of your local docker containers.
The production/public website is hosted by Acquia and can be accessed at https://www.boston.gov.
The local development version of the public website can be viewed at: https://boston.lndo.site. This local copy of the Drupal website is served (by Apache) from the appserver
docker container, and its content is stored and retrieved from a MySQL database in the database
docker container.
You will find the CityOfBoston/patterns repo cloned into the root/patterns
folder on your host dev computer.
The production/public patterns library is hosted by City of Boston from our AWS/S3 infrastructure and can be accessed at https://patterns.boston.gov.
The local development version of the patterns library is hosted by Fleet and can be viewed at https://patterns.lndo.site. This local copy of the Fleet website is served (by Node/Fractal) from the patterns
docker container.
You will find the CityOfBoston/patterns repo cloned into the root/patterns
folder on your host dev computer.
The gulp, stencil, fractal and other services running in the patterns
docker container will automatically build the local fleet static website into root/patterns/public
from the underlying files in real-time as they are changed.
The production/public patterns CDN is hosted by City of Boston from our AWS/S3 infrastructure at https://patterns.boston.gov.
The local development version of the CDN is hosted by Fleet at https://patterns.lndo.site. This local CDN is served (by Node/Fractal) from the patterns
docker container.
You will find the CityOfBoston/patterns repo cloned into the root/patterns
folder on your host dev computer.
The gulp, stencil, fractal and other services running in the patterns
docker container will automatically build the local fleet static website into root/patterns/public
from the underlying files in real-time as they are changed.
The following is a printout of the console from a typical build following the instructions on the Installation Instructions page.
Specifically this output is from the command:
The log above was generated using lando start with this .lando.yml landofile.
The log above was generated using lando start with this config.yml project file.
For developers using the PhpStorm IDE: how and where to update your settings/preferences to make debugging and developing in Drupal easier.
Steps 1 - 7 must be completed while the computer is connected to the city network.
Using Windows POWERSHELL (as Administrator):
Launch POWERSHELL as administrator: search powershell
from Windows search
Alternative strategy
This may work without Windows requesting a restart at the end.
Using CMD (console):
To open a CMD console search for cmd
in the Windows search
Alternative strategy:
This may provide a more fault tolerant WSL environment when we are switching from the City network to an external network (because we are controlling where the distro is installed, and it's not on the user's profile).
Using LINUX (WSL) console:
To get the Linux console, open a CMD console, type: wsl
These configuration files tweak the WSL environments to enable a better developer experience based on a standard CoB laptop configuration (i.e. minimum i7 chip, 32GB RAM and SSD harddisk).
Using a POWERSHELL console from the windows host:
Using a LINUX console (WSL):
Using LINUX console
If you have trouble accessing the internet from WSL, first try RESTARTING the computer.
If that does not work, using a LINUX console try:
=> then restart the computer.
Mount your development folders into WSL using the LINUX console:
Replace c:/Users/xxxx/sources
with the location in the windows host where you plan to keep all development source files.
This is the folder where you will be cloning the CoB repos.
If in doubt, create a sources
folder in your windows home folder, and for the command above just replace xxxx
with your CoB supplied EmployeeID/User Account.
Replace yyyy with the accountname you used when you installed WSL (you can find this in the LINUX console by running cd ~ && pwd - the path displayed will be in the format /home/accountname).
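A sketch of what this step can look like. It assumes the Windows drive is auto-mounted under /mnt/c (the WSL default); replace xxxx and yyyy as described above.

```bash
# Bind-mount the Windows sources folder into the WSL home...
sudo mkdir -p /home/yyyy/sources
sudo mount --bind /mnt/c/Users/xxxx/sources /home/yyyy/sources

# ...or, more simply, symlink it:
# ln -s /mnt/c/Users/xxxx/sources /home/yyyy/sources
```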
Double click the installer to launch:
- Click OK to accept the non-Windows app
- Select WSL2 as the backend (rather than Hyper-V)
Docker desktop does not automatically start after the install, you need to start it the first time from the Start menu.
Restart your computer after this step.
If you do not, and subsequently restart the computer while off the city network, your installation will be broken, and you will have to remove Docker and WSL, and start over.
(see "Docker Fails to Restart" notes below to fix broken/non-functional WSL installs)
Verify AWS is installed using LINUX console:
You should see an output something like:
aws-cli/2.7.4 Python/3.9.11 Linux/5.10.102.1
.....
Obtain your secret access keys for AWS from the AWS administrator, and then create the AWS credentials file using the LINUX console:
Alternatively, you could also create and edit the credentials file using vim, which is installed in the WSL instance (from step 5 above).
Add your ssh keys into your Windows account (typically into a Windows folder on your home drive) and then from a LINUX console:
Replace xxxx with your EmployeeID/User Account from CoB.
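A sketch of copying existing keys from the Windows profile into the WSL home and fixing their permissions (paths and key names are illustrative):

```bash
mkdir -p ~/.ssh
cp /mnt/c/Users/xxxx/.ssh/id_rsa ~/.ssh/
cp /mnt/c/Users/xxxx/.ssh/id_rsa.pub ~/.ssh/
chmod 700 ~/.ssh && chmod 600 ~/.ssh/id_rsa
```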
Microsoft Visual Studio Code (VSC)
PHP Storm
Using POWERSHELL:
Using POWERSHELL:
Using LINUX console:
Replace xxxx
with your CoB supplied EmployeeID/User Account.
Replace yyyy
with the accountname you used when you installed WSL.
Using LINUX console:
Replace yyyy
with the accountname you used when you installed WSL.
Using LINUX console:
Using Powershell (as Administrator):
From Powershell console reinitialize WSL:
From LINUX (WSL) console reset the nameserver so you can access the internet:
Where X.X.X.X is the IPAddress: 8.8.8.8 (confirm if there should be a different address) when in the office and 10.241.241.70 when not on the city network but using a VPN.
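A sketch of the reset (replace X.X.X.X with the address described above):

```bash
sudo bash -c 'echo "nameserver X.X.X.X" > /etc/resolv.conf'
```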
If, when restarting the computer, Docker fails to start and/or you get the following error when starting WSL:
The service cannot be started, either because it is disabled or because it has no enabled devices associated with it.
To fix this, perform the following steps.
Step 1: Using Powershell (ps) as Admin:
Step 2: Then using a CMD shell (as Admin)
Step 3: Restart Docker for Windows from the start menu.
@see
Download installer from h
This is a series of videos around site building and administration tasks. The individual videos in the series are listed in the upper right panel of the screen:
https://www.youtube.com/watch?v=R1ivRsz_urk&list=PLpVC00PAQQxGwyvUD_tYcBbLJqRC1CZ6U
City of Boston use Acquia to host our Drupal website.
Acquia provides a number of different environments for COB to use. One of those environments is production; the others are non-production, named: stage, dev, uat, ci & dev2.
Detail on deployment is covered elsewhere, but in summary we are able to "bind" certain branches of our GitHub repo (CityofBoston/boston.gov-d8) to these Acquia environments, and when changes occur in those branches, a deployment is automatically triggered.
Therefore, the way we branch-off, push-to and merge the "bound" branches is important.
The develop
branch is bound to the Acquia dev environment, and the master
branch to the stage environment. Changes cannot be made directly onto the master
branch, and changes should not be made directly onto the develop
branch - except when hotfixes are needed.
Best Practice is to create a working branch off develop
, then check out that working branch
locally.
Updated code should be committed to the locally checked out copy of the working branch
Updating the local working branch
will update the local containerized website for testing.
Periodically, the local working branch
should be pushed to the remote working branch
in GitHub.
Updating the working branch
in GitHub will not trigger any deploys or update any website.
To start the deploy to the dev environment, a PR is created in GitHub to merge the working branch
in GitHub into the develop
branch in GitHub.
Merging will trigger a build and the website on the dev environment will be updated.
When ready to deploy to the stage environment, a PR is created in GitHub to merge the develop
into the master
branch in GitHub.
Merging will trigger a build and the website on the stage environment will be updated.
To deploy to the production environment, use the Acquia Cloud UI - see continuous deployment notes.
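A sketch of the working-branch flow described above (the branch name is illustrative):

```bash
# Branch off develop and work locally.
git checkout develop
git pull origin develop
git checkout -b working/my-feature

# ...commit work locally, then push to GitHub (no deploy is triggered)...
git push -u origin working/my-feature

# Finally, open a PR in GitHub from working/my-feature into develop;
# merging that PR triggers the build and deploy to the Acquia dev environment.
```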
We can bind a branch to the dev2, ci or uat environments so that we can share proposed or interim website changes with stakeholders or other individuals where a local containerized website is not appropriate. These environments can be considered on-demand, and the way to update them is similar to, but slightly different from, the normal deploy pipeline, requiring an extra branch.
Branches attached to environments other than dev, stage and production in Acquia are termed environment branches (see also On-Demand Instances).
Initially, an environment branch
is created from the develop
branch.
This environment branch
is then bound to the desired Acquia environment (dev2, ci or uat).
Developers then create a working branch
off the environment branch
and check out that working branch
locally.
Developers commit their work to the local copy of the working branch
which can be pushed to the remote working branch
in GitHub whenever desired.
Updating the local working branch
will update the local containerized website for testing.
Updating the working branch
in GitHub will not trigger any deploys or update any website.
When ready to update the website on the bound environment, using a PR, the GitHub copy of the working branch
is merged to the environment branch
in GitHub.
Merging will trigger a deploy to the bound Acquia environment (i.e. dev2, uat or ci) and update the website on that environment.
Stakeholders can be directed to the website on the Acquia environment.
Once the project or piece of work is complete, a PR to merge the GitHub environment branch
to the develop
branch is created.
Merging will trigger a deploy to dev and update the website.
To continue to deploy to stage and production environments, follow the notes in Normal Deploy Pipeline above.
Sometimes a picture is worth 1,000 words.
In the above diagram,
Lines with an arrow indicate a merge to the branch in the direction of the arrow.
Lines with a dot connector indicate the creation (or updating) of a branch - and when the line is to a local branch it is a checkout to a local branch.
The master
branch is the production branch and cannot be pushed/merged to directly.
The correct way to update master
is to merge the develop
branch into the master
branch.
At all times the master
branch should be a copy of the code on the production environment. (see continuous deployment)
Green arrows cause a deployment process:
Only if the branch being merged into is bound to an Acquia environment, and
This is controlled/executed by Travis, taking approx 3 mins (uses 30 Travis credits), and
The website hosted on the Acquia Environment is updated during the deploy.
Orange arrows cause a build, test and deployment process:
Only if the branch being merged into is bound to an Acquia environment, and
This is controlled/executed by Travis, taking approx 30 mins (uses 300 Travis credits), and
The website hosted on the Acquia Environment is updated during the deploy.
Travis is configured so that this extended process usually only runs when committing to the develop
branch - triggering a deploy to the Acquia Dev environment as the first step of the deployment pipeline.
Black arrows indicate a simple commit/merge process with no building or deploying:
Best practice requires that a working branch
is not bound to Acquia Environments
Merging does not trigger Travis, there is no deploy and 0 Travis credits are used
Note: A GitHub environment branch
can be bound to one or more Acquia Environments. When this is the case, deploys will occur simultaneously to all bound environments when the GitHub environment branch
is updated.
Travis always controls deploys, but only one set of credits is used per environment branch
merge regardless of how many Acquia environments it is bound to.
Custom theme which presents the front-end UI to all users.
Breadcrumbs are an informative device which appear on many pages on the site. Breadcrumbs provide the user a sense of location within the site and a way to logically navigate back to the homepage.
A breadcrumb is an ordered collection of crumbs, with each crumb having a title and a link.
Drupal has a built-in breadcrumbs methodology, which will attempt to build out a pathway based on the URI (e.g. /departments/housing/metrolist
) defined by the page's (i.e. node's) URL Alias.
It does not matter whether the URL Alias is set manually or automatically; the value shown in the back-end editor form once the node is saved is used to build out the breadcrumb.
The Drupal core process creates the breadcrumb by scanning the path represented by the URI, and testing if a local page exists for each path element. It stops adding crumbs when a path element does not resolve.
FOR EXAMPLE an article is created with a URI (as defined in its URL Alias):
/departments/housing/boston/housing-information-in-boston.
When the page is rendered, Drupal scans the article's URI and
if we have a breadcrumb setting which stipulates that the homepage should always be shown as the first crumb, then a crumb of home
with a link to https://site
is created, then
checks if /departments
is a valid URI. https://site/departments
is a valid URI, so it creates a crumb of "departments" with a link to https://site/departments
, then
checks if /departments/housing
is a valid URI. https://site/departments/housing
is a valid URI, so it creates a crumb of "housing" with a link to https://site/department/housing
, then
checks if /departments/housing/boston
is a valid URI. https://site/departments/housing/boston
is NOT a valid URI - there is no page with that name on https://site
so the breadcrumb scanner stops evaluating at this point, but
if we have a breadcrumb setting to display the actual page in the breadcrumb then a final crumb of housing information in boston
is added, with no link (because this is the page showing).
The final breadcrumb in this instance would be HOME > DEPARTMENTS > HOUSING > HOUSING INFORMATION IN BOSTON with links on the first 3 crumbs.
When evaluating if a page exists on the site, Drupal only considers URL Aliases and does not check URL Redirects.
So in the example above, the boston
crumb/link still would not appear in the breadcrumb even if a place_profile
page for Boston existed with the URL Alias of /places/boston
and a URL Redirect for /departments/housing/boston
.
Where Drupal core cannot build out its own breadcrumb trail, there is some additional custom code intended to help make a logical breadcrumb.
The custom breadcrumb code only functions when it determines that Drupal has not built out the entire breadcrumb.
If Drupal has been able to build out all parts of the URI path, then the Drupal breadcrumb is used.
The custom code scans URL redirects as well as URL Aliases when building out the breadcrumbs.
Care: Redirects which are manually made on the page admin/config/search/redirect
are usually considered "external" by default. Breadcrumbs which use an external link may behave unexpectedly when clicked.
Example: the breadcrumb on d8-dev.boston.gov may open a page on www.boston.gov when clicked.
Solution: Do not create redirects for internal (i.e. Drupal hosted) pages in the admin/config/search/redirect
page. Instead create redirects using the redirect function on the "advanced" tab of the editor form for a page.
Some URI paths are hard-coded to build specific breadcrumbs.
For example pages which have a URI path starting with government/cabinet
. The custom code ignores the "government/cabinets" part of the path and then builds the breadcrumb from the remainder of the path.
The custom breadcrumb object is built here: bos_theme/bos_theme.theme::bos_theme_preprocess_breadcrumb()
The breadcrumb is styled here: bos_theme/templates/navigation/breadcrumb.html.twig
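The real logic lives in the bos_theme files referenced above; the following is only a minimal sketch of the general approach (walk the alias and add a crumb for each path element that resolves to a page), not the actual boston.gov code. The theme name and variable structure are illustrative.

```php
<?php

/**
 * Sketch only: builds crumbs from the parts of the current URL alias.
 */
function mytheme_preprocess_breadcrumb(array &$variables) {
  $alias = \Drupal::service('path_alias.manager')
    ->getAliasByPath(\Drupal::service('path.current')->getPath());

  $crumbs = [];
  $path = '';
  foreach (array_filter(explode('/', $alias)) as $segment) {
    $path .= '/' . $segment;
    // Only add a crumb if this partial path resolves to a real page.
    if ($url = \Drupal::service('path.validator')->getUrlIfValid($path)) {
      $crumbs[] = [
        'text' => str_replace('-', ' ', $segment),
        'url' => $url->toString(),
      ];
    }
  }
  $variables['breadcrumb'] = $crumbs;
}
```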
When you make an html.twig
file and add it to the templates folder of a custom theme you are pretty much done (after refreshing caches!). The Drupal theme rendering process detects the template and uses it in preference to any template of the same name from a parent or default theme. You don't really have to do anything more than add the file and refresh cache.
But, if you add a template to a custom module - even if your intent is just to override a theme default template (e.g. field.html.twig
) or to provide a suggested template, there are a few extra things you must do.
Using the example of a custom content type (node) called "node_landing_page", the steps below fully implement a template to be used to render the nodes full
display.
Note: Drupal automatically generates the suggestion for node__landing_page__full
which can be used for rendering the "default" (i.e. "full") display.
You can generate other suggestions using the hook_theme_suggestions_hook
hook.
Create the twig template you wish to use, and give it a name that matches an existing Drupal theme suggestion with ".html.twig" as the extension.
In rare cases you may want to create a new template suggestion. Do this by returning an array of suggestions from a hook_theme_suggestions_hook()
in your custom module (see last example below).
Convention is to name the template using an "entity breadcrumb" style, with "--"'s between entities and no spaces.
Save the template file in a folder called templates
in your custom modules root folder. In our example docroot/modules/custom/node_landing_page/templates
.
- You could organize files by creating a sub-folder tree - but if you do, you will then have to specify the path
to your template in the hook_theme
- see step 3 below.
In the hook_theme
of your module you must define your new template. This hook is read by the Drupal core theme engine and loaded into a template cache (aka register). Whenever a change is made to this hook you need to clear all caches to load your changes into the cache.
In hook_theme
return an assoc array with key-value pair nested arrays for each template you wish to define.
- The outer keys (template-keys) should be one for each of the templates you are defining. Keep it simple and traceable by setting the template-key name to be the template filename without the ".html.twig". Important: Replace all "-"'s with "_"'s in the template-key string. (in our example the template-key is node__landing_page__full
)
- The value for the key (template-key) is an array with a required base_hook
and several other optional fields.
The base_hook
should define the entity type this template is used to render (in our case node
but other common entities we theme are field, region, block, paragraph, taxonomy_term
) .
[optional] The render element
defaults to elements
if not specified.
[optional] If you wish to use a template file which is not the same name as the suggestion (with "_"'s replaced with "-"'s) then you must specify its name in the template
field. Omit the ".html.twig" extension. This could be useful if you want two displays to share the same template.
[optional] If you want to use a custom path to the template file (i.e. not the default templates folder) then use the path
field.
(see bos_link_collections_theme
in boston.gov for example)
(see "Our Example hook_theme" below for the complete hook)
[optional] Once the cache is cleared you can then catch pre-process events using hook_preprocess_hook. In our example this would be node_landing_page_preprocess_node
(to catch all node pre-process events) or node_landing_page_preprocess_node__landing_page__full
(to catch only this new template pre-process events) - notice that the hook uses the template-key
defined in the hook_theme
array.
[optional] You can also catch template_preprocess_hook
events (in our example this is template_preprocess_node__landing_page__full
).
This hook is commonly used to create a content
variable which contains all the rendered (or renderable) elements of the elements
(or whatever the field is named in the templates render element
) array.
Our Example template file:
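The original example file is not reproduced here; a plausible minimal version of node--landing-page--full.html.twig would be something like:

```twig
{# node--landing-page--full.html.twig - illustrative sketch only #}
<article{{ attributes.addClass('node', 'node--landing-page--full') }}>
  {{ title_prefix }}
  <h1{{ title_attributes }}>{{ label }}</h1>
  {{ title_suffix }}
  <div class="node__content">
    {{ content }}
  </div>
</article>
```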
Our Example hook_theme:
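A sketch of the hook_theme() implementation described in step 3 (note that the Drupal array keys are 'base hook' and 'render element'; the optional template/path keys are shown commented out):

```php
<?php

/**
 * Implements hook_theme() - illustrative sketch for the landing page example.
 */
function node_landing_page_theme($existing, $type, $theme, $path) {
  return [
    'node__landing_page__full' => [
      'base hook' => 'node',
      'render element' => 'elements',
      // 'template' => 'node--landing-page--full', // only if the file name differs.
      // 'path' => $path . '/templates/nodes',     // only for a non-default folder.
    ],
  ];
}
```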
Our Example hook_preprocess_hook (version 1):
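A sketch of version 1, catching all node pre-process events and filtering down to the landing page bundle:

```php
<?php

/**
 * Implements hook_preprocess_node() - sketch only.
 */
function node_landing_page_preprocess_node(array &$variables) {
  $node = $variables['node'];
  if ($node->bundle() === 'landing_page' && $variables['view_mode'] === 'full') {
    // Example: expose an extra variable to the twig template.
    $variables['landing_page_title'] = $node->getTitle();
  }
}
```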
Our Example hook_preprocess_hook (version 2):
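A sketch of version 2, catching only the new template's pre-process events (the function name uses the template-key defined in hook_theme()):

```php
<?php

/**
 * Preprocess only the node__landing_page__full template - sketch only.
 */
function node_landing_page_preprocess_node__landing_page__full(array &$variables) {
  // Runs only when the landing page "full" display is rendered.
  $variables['attributes']['class'][] = 'landing-page--full';
}
```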
Our Example template_preprocess_hook:
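A sketch of the template_preprocess hook, building the content variable from the render element as described above:

```php
<?php

use Drupal\Core\Render\Element;

/**
 * Builds a 'content' variable from the render element - sketch only.
 */
function template_preprocess_node__landing_page__full(array &$variables) {
  $variables['content'] = [];
  foreach (Element::children($variables['elements']) as $key) {
    $variables['content'][$key] = $variables['elements'][$key];
  }
}
```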
Our Example hook_theme_suggestions_hook:
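A sketch of a suggestions hook that adds a per-view-mode suggestion for landing page nodes:

```php
<?php

/**
 * Implements hook_theme_suggestions_node_alter() - sketch only.
 */
function node_landing_page_theme_suggestions_node_alter(array &$suggestions, array $variables) {
  $node = $variables['elements']['#node'] ?? NULL;
  if ($node && $node->bundle() === 'landing_page') {
    $suggestions[] = 'node__landing_page__' . $variables['elements']['#view_mode'];
  }
}
```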
Modules can contain multiple paragraphs grouped by similar function.
A good example module can be found at:
Module naming convention is to call the module bos_moduleName
. The "moduleName" should be indicative of the paragraph/s contained within the module.
Sub-pages in this section assumes an example module is to be named bos_module_name
- with the module folder:
Custom nodes deployed in boston.gov have a navigation menu which sits below the introduction text on each page.
The in-page menu requires the node to embed paragraphs, the node--xxxx.html.twig to contain a <div> and for each embedded paragraph to have a key field.
If the node has components (paragraphs) embedded, then the node will have a field called field_components
and this field will be of a type Entity reference revisions
. The field will allow only paragraphs, and will specify the paragraph types that are allowed on the node.
To enable in-page navigation, each paragraph must have a (text field) field_short_title
, and to reduce confusion for content editors, that field should be named "Navigation Title".
To make the menu look nice and work well on mobile devices, content editors and authors should be encouraged to keep the content added to the Navigation Title to 20 chars or less.
To enable the in-page navigation menu, the nodes template should include the following:
This block should ideally be located below the title and intro-text sections.
When there is more than one paragraph embedded in a node's web page, an in-page navigation menu should appear on the page. The menu should be styled from the patterns library.
UX Desktop: When the page first loads, the menu should display above the fold. As the user scrolls down the page, the menu should collapse into a fixed toolbar at the top of the page, below the seal menu with the seal retracted. Theme should come from patterns.
UX Mobile: Menu should appear as a collapsed set of drawers with a chevron icon to expand. Css from patterns controls the collapse across the responsive page width.
In either UX, when the user clicks on the menu, the page should scroll smoothly down to the correct paragraph display on the webpage.
The twig template (e.g. node--xxx.html.twig
) for the node is responsible for locating the menu on the node. The code required is described above.
On-page menu elements are rendered from the bos_theme_preprocess_node()
and bos_theme_preprocess_field()
hooks in bos_theme.theme
found in /themes/custom/bos_theme/
.
The page click and scrolling is provided by component-navigation.boston.js
which is found in /themes/custom/bos_theme/js/
.
To make a paragraph include itself in the in-page navigation menu, it just needs to contain a text field named field_short_title
(and for that field to be included in the display being used on the node).
Converting D7 structures to D8
Login to the website and go to the paragraphs admin page (/admin/structure/paragraphs_type
) and delete the paragraph you want to work on
Step 1 above may delete some of the field.storage dependencies (field definitions), so just re-import all the bos_component module config to make sure you get all the shared config back into the database: lando drush config-import --partial --source=/app/docroot/modules/custom/bos_components/config/install
Create the module scaffolding using drush, for example: lando drush componetize bos_discussion_topic --components=discussion_topic
Add hook_theme() to .module file to connect to the paragraph template
Copy the corresponding paragraph template from boston.gov-d8/docroot/themes/preConversion/component
and put it in the scaffolding that the drush command from step 3 created: docroot/modules/custom/bos_components/modules/bos_discussion_topic/templates
Enable the module: lando drush en bos_discussion_topic
In the Drupal UI, add the new bundle to the field_components
paragraph types list for the Test Component Page content type: /admin/structure/types/manage/test_component_page/fields/node.test_component_page.field_components
Create a test page with the component added to review admin UI and display
Importing a single config file:
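The exact command was not captured here; one common approach (a sketch, with an illustrative file name) is to stage the single YAML file in a temporary folder and run a partial import against it:

```bash
mkdir -p /tmp/single-config
cp docroot/modules/custom/bos_cabinet/config/install/views.view.cabinets.yml /tmp/single-config/
lando drush config-import --partial --source=/tmp/single-config -y
```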
Exporting database config directly to your module (Important: the config file needs to be referenced in your module's info file under the config-devel
key): lando drush config-devel-export bos_cabinet
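A sketch of the info file entry (the key spelling and the listed config name are illustrative; match whatever the existing COB modules use):

```yaml
# bos_cabinet.info.yml (sketch)
name: 'Boston Cabinet'
type: module
core: 8.x
config_devel:
  install:
    - views.view.cabinets
```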
City of Boston supports development of discrete React (and other JS framework) WebApps. Because these services will be hosted on Drupal, there is a custom Drupal webapp launcher and some conventions to follow.
Have stable local build of Drupal 8 website running on your machine.
Make sure you are “logged in” or have “admin” access to view the CMS and add new content / nodes.
Using Drush: lando drush uli
Using Drupal web login: https://boston.lndo.site/user/login?local
Navigate to Content menu item (make sure you are logged into Drupal to view) https://boston.lndo.site/admin/content
Scroll to the bottom of the page and add a content item by clicking “Add Content”. Select the “Listing Page” content type.
Give new Page content a Title. This is required.
Click on the “Components” tab on the left menu
Find the dropdown Select menu to add “new component” and select “Web App” from the list.
Name the Web App something appropriate as it relates to your project. (i.e. Metrolist or My Neighborhood)
Click “Save” near the bottom or side of the page to save and create a new page / node. This will serve as the container page / component for your new web app.
Navigate to the “bos_web_app” directory of the drupal 8 repository that is checked out to your local machine /docroot/modules/custom/bos_components/modules/bos_web_app.
Locate the “apps” folder / directory. If one doesn’t exist, please create it. /docroot/modules/custom/bos_components/modules/bos_web_app/apps
Inside this “apps” directory create an empty folder and name it the same name you called your Web App in Step 6 of Part 1 above. /docroot/modules/custom/bos_components/modules/bos_web_app/apps/my_neighborhood NOTE: Any spaces in your app name should be replaced with underscores. For example, My Neighborhood would have a folder name of “my_neighborhood”.
Locate and open the libraries yml file named “bos_web_app.libraries.yml”. This file will serve as the pointer and compiler that will tell Drupal to attach and bundle all your JS and CSS files for your application. /docroot/modules/custom/bos_components/modules/bos_web_app/bos_web_app.libraries.yml
See an example libraries.yml file on GitHub for a project that is currently being developed. https://github.com/CityOfBoston/boston.gov-d8/blob/mnl_12-9-2019/docroot/modules/custom/bos_components/modules/bos_web_app/bos_web_app.libraries.yml Drupal also has good documentation on using libraries and attaching files. https://www.drupal.org/docs/8/creating-custom-modules/adding-stylesheets-css-and-javascript-js-to-a-drupal-8-module
Once you have the libraries.yml file set up, go create the files needed, OR first create the files you’d like and then add them to the libraries.yml as laid out in Part 2 - Step 5 above.
It’s important to note that any time you add a new attached file to libraries.yml, the Drupal cache will have to be cleared for changes to take effect. You can clear the cache either through the Drupal CMS or via the Drush CLI
Drupal CMS: navigate to admin/config/development/performance, and click button at top of the page labeled “clear all caches”
Using Drush CLI: drush cr
After clearing the cache, you should now see your application load on the Drupal page you created and saved in Part 1 - Step 7. NOTE: You will NOT have to clear the Drupal cache every time you make a change to a CSS or JS file. This is only for new items in the libraries.yml file.
Once you have the libraries file open, add an entry with the name of your application and add / attach necessary items to your application. For example, the application “My Neighborhood” would have a library entry as such…
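A sketch of such an entry (file names and the version value are illustrative):

```yaml
# bos_web_app.libraries.yml (sketch)
my_neighborhood:
  version: 1.x
  js:
    apps/my_neighborhood/js/main.js: {}
  css:
    theme:
      apps/my_neighborhood/css/main.css: {}
  dependencies:
    - core/drupal
```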
| Entity | Field | min/max resolution & max filesize | View: Style |
| --- | --- | --- | --- |
| **Images** | | | |
| node:department_profile | field_icon | 56x56/++ - 200KB | default: (i) square_icon_56px; Article: (i) square_icon_56px; Card: (i) square_icon_56px; Article: not displayed; Published By: (i) square_icon_56px |
| node:event | field_intro_image | 1440x396/++ 8 MB | default: (b) intro_image_fields; featured_item: (i) Featured Item Thumbnail |
| node:event | field_thumbnail | 525x230/++ 8 MB | default: (b) thumbnail_event; featured_item: (p) thumbnail_event |
| node:how_to | field_intro_image | 1440x396/++ 8 MB | default: (b) intro_image_fields; [all others (10)]: not displayed |
| node:listing_page | field_intro_image | 1440x396/++ 8MB | default: (b) intro_image_fields; [all others (12)]: not displayed |
| node:person_profile | field_person_photo | 350x350/++ 5MB | default: (p) person_photos; listing: (p) person_photos; embed: (p) person_photos |
| node:place_profile | field_intro_image | 1440x396/++ 8MB | default: (b) intro_image_fields; Listing: (p) card_images; Teaser: not displayed |
| node:post | field_intro_image | 1440x396/++ 8MB | default: (b) intro_image_fields; featured_item: not displayed; Listing: not displayed; Listing short: not displayed; Teaser: not displayed |
| node:post | field_thumbnail | 700x700/++ 5MB | default: not displayed; featured_item: (p) featured_images; Listing: (i) News Item - thumbnail (725x725); Listing short: (i) News Item - thumbnail (725x725); Teaser: (i) News Item - thumbnail (725x725) |
| node:program_i_p | field_intro_image | 1440x396/++ 8MB | default: (b) intro_image_fields; listing: (b) card_images |
| node:program_i_p | field_program_logo | 800x800/++ 2MB | default: (p) logo_images; Listing: not displayed |
| node:site_alert | field_icon | 56x56/++ - 200KB | default: (s) n/a svg (square_icon_56px); Embed: (i) square_icon_56px; Teaser: not displayed |
| node:status_item | field_icon | 65x65/++ - 200KB | default: (s) n/a svg (square_icon_65px); listing: (s) n/a svg (square_icon_65px); teaser: (s) n/a svg (square_icon_65px) |
| node:tabbed_content | field_intro_image | 1440x396/++ 8MB | default: (b) intro_image_fields |
| node:topic_page | field_intro_image | 1440x396/++ 8MB | default: (b) intro_image_fields; featured_topic: not displayed; listing_long: (b) intro_image_fields; listing: (b) card_images |
| node:topic_page | field_thumbnail | | default: not displayed; featured_topic (p) featured_images: not displayed; listing: not displayed; listing_long: not displayed |
| para:card | field_thumbnail | 670x235/++ 2MB | default: (b) card_images |
| para:columns | field_image | 200x200/++ 2MB | default: (i) Med Small Square (also Person photo a-mobile 1x (110x110)) |
| para:fyi | field_icon | 56x56/++ 200KB | default: (s) n/a svg (square_icon_56px) |
| para:hero_image | field_image | 1440x800/++ 8 MB | default: (b) Hero fixed image fields; Separated Title: not displayed |
| para:map | field_image | 1440x800/++ 8 MB | default: (b) Photo Bleed Images |
| para:photo | field_image | 1440x800/++ 8 MB | default: (b) Photo Bleed Images |
| para:quote | field_person_photo | 350x350/++ 5 MB | default: (i) Person photo a-mobile 1x (110x110) |
| para:signup_emergency_alerts | field_icon | n/a svg | default: (s) n/a svg (square_icon_65px) |
| para:transactions | field_icon | 180x100/++ - 2MB | default: (i) transaction_icon_180x100; group_of_links: (i) transaction_icon_180x100 |
| para:video | field_image | 1440x800/++ 8 MB | default: (b) Photo Bleed Images |
| tax:features | field_icon | svg | default: (s) n/a svg (square_icon_56px); sidebar_right: (s) n/a svg (square_icon_56px) |
| entity:user | user_picture | 100x100/1024/1024 1 MB | default: (p) person_photos; compact: (i) Person photo a-mobile 1x (110x110) |
| entity:media.image | image | +++/2400/2400 8 MB | default: (i) original image; [all others]: (i) Media Fixed Height (100px) |
| **Files** | | | |
| media.document | field_document | | |
| node:procurement | field_document | | |
| para:document | field_document | | |
| Breakpoint | Start width | End width | Note |
| --- | --- | --- | --- |
| **group: hero** | | | |
| mobile | 0 | 419 | |
| tablet | 420 | 767 | |
| desktop | 768 | 1439 | |
| large | 1440 | 1919 | Introduced in D8 |
| oversize | 1920 | +++ | have a notional max-width of 2400px |
| **group: card** | | | |
| mobile | 0 | 419 | |
| tablet | 420 | 767 | |
| desktop | 768 | 839 | |
| desktop | 840 | 1439 | |
| large | 1440 | 1919 | |
| oversize | 1920 | +++ | have a notional max-width of 2400px |
| **group: person** | | | |
| mobile | 0 | 839 | |
| tablet | 840 | 979 | |
| desktop | 980 | 1279 | There is also a breakpoint at 1300 in node:pip |
| desktop | 1280 | +++ | have a notional max-width of 2400px |
| Breakpoint | Responsive style | Style | Size |
| --- | --- | --- | --- |
| **All Nodes: field_intro_image (excluding node:post)** | | | |
| hero: mobile (<419px) | intro_image_fields | Intro image a-mobile 1x | 420x115 |
| hero: tablet (420-767px) | intro_image_fields | Intro image b-tablet 1x | 768x215 |
| hero: desktop (768-1439px) | intro_image_fields | Intro image c-desktop 1x | 1440x396 |
| hero: large (1440-1919px) | intro_image_fields | Intro image d-large 1x | 1920x528 |
| hero: oversize (>1920px) | intro_image_fields | Intro image e-oversize 1x | 2400x660 |
| **node:post field_intro_image** | | | |
| hero: mobile (<419px) | Hero fixed image fields | Hero fixed a-mobile 1x | 420x270 |
| hero: tablet (420-767px) | Hero fixed image fields | Hero fixed b-tablet 1x | 768x400 |
| hero: desktop (768-1439px) | Hero fixed image fields | Hero fixed c-desktop 1x | 1440x460 |
| hero: large (1440-1919px) | Hero fixed image fields | Hero fixed d-large 1x | 1920x460 |
| hero: oversize (>1920px) | Hero fixed image fields | Hero fixed e-oversize 1x | 2400x460 |
| **para:photo field_image, para:video field_image, para:hero field_image, para:map field_image** | | | |
| hero: mobile (<419px) | Photo Bleed Images | Photo bleed a-mobile 1x | 420x250 |
| hero: tablet (420-767px) | Photo Bleed Images | Photo bleed b-tablet 1x | 768x420 |
| hero: desktop (768-1439px) | Photo Bleed Images | Photo bleed c-desktop 1x | 1440x800 |
| hero: large (1440-1919px) | Photo Bleed Images | Photo bleed d-large 1x | 1920x800 |
| hero: oversize (>1920px) | Photo Bleed Images | Photo bleed e-oversize 1x | 2400x800 |
| **find** | | | |
| card: mobile (<419px) | Card Images 3w | Card grid vertical a-mobile 1x | 335x117 |
| card: tablet (420-767px) | Card Images 3w | Card grid vertical b-tablet 1x | 615x215 |
| card: desktop (768-839px) | Card Images 3w | Card grid vertical c-desktop 1x | 670x235 |
| card: desktop (840-1439px) | Card Images 3w | Card grid horizontal c-desktop 1x | 382x134 |
| card: large (1440-1919px) | Card Images 3w | Card grid horizontal d-large 1x | 382x134 |
| card: oversize (>1920px) | Card Images 3w | Card grid horizontal e-oversize 1x | 382x134 |
| **para:column** (this should be a 200x200 circle ??) | | | |
| card: mobile (<419px) | Card Images 3w | Photo bleed a-mobile 1x | 335x117 |
| card: tablet (420-767px) | Card Images 3w | Photo bleed b-tablet 1x | 615x215 |
| card: desktop (768-839px) | Card Images 3w | Photo bleed c-desktop 1x | 670x235 |
| card: desktop (840-1439px) | Card Images 3w | Photo bleed c-desktop 1x | 382x134 |
| card: large (1440-1919px) | Card Images 3w | Photo bleed d-large 1x | 382x134 |
| card: oversize (>1920px) | Card Images 3w | Photo bleed e-oversize 1x | 382x134 |
| **post:field_thumbnail(feature)** | | | |
| card: mobile (<419px) | Featured Images | Featured image a-mobile 1x | 335x350 |
| card: tablet (420-767px) | Featured Images | Featured image b-tablet 1x | 614x350 |
| card: desktop (768-839px) | Featured Images | Featured image c-desktop 1x | 671x388 |
| card: desktop (840-1439px) | Featured Images | Featured image d-full 1x | 586x388 |
| card: large (1440-1919px) | Featured Images | Featured image d-full 1x | 586x388 |
| card: oversize (>1920px) | Featured Images | Featured image d-full 1x | 586x388 |
| **node:person_profile:field_person_profile, user:user_picture** | | | |
| person: mobile (<839px) | Person Photos | Person Photos a-mobile 1x | 110x110 |
| person: tablet (840-979px) | Person Photos | Person Photos b-tablet 1x | 120x120 |
| person: desktop (980-1279px) | Person Photos | Person Photos c-desktop 1x | 148x148 |
| person: desktop (>1280px) | Person Photos | Person Photos d-full 1x | 173x173 |
| **node:pip:field_program_logo** | | | |
| person: mobile (<839px) | Logo Images | logo square a-mobile 1x | 672x672 |
| person: tablet (840-979px) | Logo Images | logo square b-tablet 1x | 783x783 |
| person: desktop (980-1279px) | Logo Images | logo square c-desktop 1x | 360x360 |
| person: desktop (>1280px) | Logo Images | logo square d-full 1x | 360x360 |
This document outlines the process for putting the Budget Fiscal Year website together. It has been shared with OBM (Office of Budget Management).
TestPage (article)
Event
Events Content (admin)
With header image
No header
Listing Page
Listing Page Content (admin)
Landing Page
Landing Page Content (admin)
homepage
Topic Page
Topic Page (Guides) Content (admin)
With Image
Place Profile
Place Profile Content (admin)
With Header Image
Person Profile
Person Profile Content (admin)
Program Initiative Profile
PIP Content (admin)
With Image
No Image
Post
Post Content (admin)
With Image
No Image
How To
How To Content (admin)
With Image
No Image
Article
Article content (admin)
Department Profile
Department Profile Content (admin)
Public Notices
Public Notice Content (admin)
Script Page
Script Page Content (admin)
A DND Development Officer is able to create a Meeting object in Sales Force, with all the meeting information, and attach it to a Project in Sales Force. When CRON runs on Drupal, it will then sync any new or updated Meetings from Sales Force with a Drupal BH Meeting. After the new meeting is created in Drupal, we also create a Drupal Event so that the meeting will be listed on the Boston.gov Events page. The Meeting is also then displayed on the corresponding Drupal BH Project.
BH Meeting Content Type: /admin/structure/types/manage/bh_meeting
Sales Force Mappings: /admin/structure/salesforce/mappings/manage/bh_community_meeting_event
Templates:
docroot/modules/custom/bos_content/modules/node_buildinghousing/templates/snippets/bh-project-meeting-notice.html.twig
docroot/modules/custom/bos_content/modules/node_buildinghousing/templates/snippets/bh-project-timeline-meeting.html.twig
Helper Functions (Pre-process, alters):
docroot/modules/custom/bos_content/modules/node_buildinghousing/node_buildinghousing.module
docroot/modules/custom/bos_content/modules/node_buildinghousing/src/BuildingHousingUtils.php
This feature allows Drupal entities to sync back and forth with Sales Force Objects via the Drupal Sales Force module. It is primarily used by DND so that the data and access already on DND's Sales Force server automatically sync with the Boston.gov Drupal site. This is controlled by field mapping configurations in the Drupal Sales Force module. Currently, all syncing is scheduled to happen on each Drupal CRON run (every 5 minutes), and only updated objects are processed.
Sales Force Mappings:
Building Housing - Projects (/admin/structure/salesforce/mappings/manage/building_housing_projects/fields)
bh_project --> Project__c
Building Housing - Website Update (/admin/structure/salesforce/mappings/manage/bh_website_update/fields)
bh_update --> Website_Update__c
Building Housing - Project Update (/admin/structure/salesforce/mappings/manage/building_housing_project_update/fields)
bh_update --> Update__c
BH Community Meeting Event (/admin/structure/salesforce/mappings/manage/bh_community_meeting_event/fields)
bh_meeting --> Community_Meeting_Event__c
Building Housing - Parcels (/admin/structure/salesforce/mappings/manage/building_housing_parcels/fields)
bh_parcel --> Parcel__c
Building Housing - Parcels-Project Assoc (/admin/structure/salesforce/mappings/manage/bh_parcel_project_assoc/fields)
bh_parcel_project_assoc --> ParcelProject_Association__c
Sales Force Settings:
Building Housing - Projects (/admin/structure/salesforce/mappings/manage/building_housing_projects)
Building Housing - Website Update (/admin/structure/salesforce/mappings/manage/bh_website_update)
Building Housing - Project Update (/admin/structure/salesforce/mappings/manage/building_housing_project_update)
BH Community Meeting Event (/admin/structure/salesforce/mappings/manage/bh_community_meeting_event)
Building Housing - Parcels (/admin/structure/salesforce/mappings/manage/building_housing_parcels)
Building Housing - Parcels-Project Assoc (/admin/structure/salesforce/mappings/manage/bh_parcel_project_assoc)
Troubleshoot Sales Force connection issues
If Drupal and Sales Force are not connecting or syncing, please check the Authorization from Drupal to Sales Force (/admin/config/salesforce/authorize/list). You may need to re-auth or even make a new connection if you need to connect to a lower development or testing environment on Sales Force. If you need access to an instance, contact DND's Sales Force developer/administrator.
If a single item is not syncing, or if you need info about the Drupal to Sales Force connection, you can view the list on this admin page. If you edit the instance you then have the option to force a pull or push of the Drupal entity to/from the Sales Force Object. If there is an issue you should see an error message in the response. You can also find other useful info like timestamps and record ids.
Drupal Building Housing records are synchronized from MOH SalesForce on a schedule. Salesforce is the authoritative source, and data should not be added or changed in Drupal.
There are 6 synchronizations with Salesforce which run in the following order, every cron run (so every 5 mins). The order is important, because Projects must be created before Attachments & Website Updates before Meetings & Chatter postings.
Each synchronization process does the following: A Drupal Application runs a Salesforce API object query to identify any records in the SF object which have been deleted or which have their last updated date after a last updated date stored by Drupal for that SF object. The identified records are then added/updated or deleted in Drupal. At the end of the process Drupal updates its last updated date for that object with the latest SF updated date found in the import. This date is then used as a high-water mark for the next import cycle.
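As a rough sketch of that high-water-mark pattern (the helper function names below are hypothetical, not the actual Drupal Salesforce module API):

```php
<?php
// Hypothetical sketch only - these helpers do not exist in the Salesforce module.
$lastSync = cob_sf_get_high_water_mark('Project__c');                 // stored by Drupal
$changed  = cob_sf_query_changed_records('Project__c', $lastSync);    // SF object query
foreach ($changed as $record) {
  if (!empty($record['IsDeleted'])) {
    cob_sf_delete_drupal_entity('bh_project', $record['Id']);         // deleted in SF
  }
  else {
    cob_sf_upsert_drupal_entity('bh_project', $record);               // added/updated in SF
  }
  // Track the newest SF modification date seen during this import.
  $lastSync = max($lastSync, $record['LastModifiedDate']);
}
cob_sf_set_high_water_mark('Project__c', $lastSync);                  // new high-water mark
```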
This synchronization imports Project records from the Salesforce Project__c object into Drupal's bh_project entity.
This synchronization manages project stages, documents and messages to appear on the timeline. It extends and replaces the functionality for the Update__c object, which is imported for legacy reasons in Building Housing - Project Update.
There is only ever 1 Website Update (Website_Update__c) record per Project (Project__c) record in Salesforce.
There is a rule in Salesforce to stop multiple records which would potentially create confusion for project stages etc.
If multiple Website Update records do exist for a Project in Salesforce, then all records will be imported into Drupal, but ONLY the last (when ordered by createdDate) will be used in the Project Timeline.
There should be no new Update__c records being created in SF. However, there are legacy records containing data which must be included in Drupal. Even though we do not normally expect the sync to process these objects, the code is important if the data is to be recreated accurately and completely (for example if a Salesforce purge is performed).
This handles legacy TextMessages (now use chatter) and document attachments (now use Website Update Attachments).
This synchronization imports Community Meeting event records from the Salesforce Community_Meeting_Event__c object into Drupal's bh_meeting entity.
This is a simple mapping and the import does little except clean up any URLs and address fields.
The bh_meeting record holds a reference to its parent bh_update, which is linked to the bh_project.
If a Meeting event is updated, or deleted in SF, then the associated record will be updated in Drupal, and if necessary will move on the timeline.
Sync Name | Drupal Destination | SF Origin |
---|---|---|
Building Housing - Projects | bh_project | Project__c |
Building Housing - Website Update | bh_update | Website_Update__c |
Building Housing - Project Update | bh_update | Update__c |
BH Community Meeting Event | bh_meeting | Community_Meeting_Event__c |
Building Housing - Parcels | bh_parcel | Parcel__c |
Building Housing - Parcels-Project Assoc | bh_parcel_project_assoc | ParcelProject_Association__c |
This page documents data sources and pipelines for maps on boston.gov that use the maps Drupal component.
This spreadsheet contains all maps that have been created. If updating that spreadsheet gets away from you and you want to know what pages on boston.gov are using maps, ask a Drupal dev on the Digital Team to run this query:
That will give you an output that looks something like this JSON file (last run 10/1/20). A "1" in the status field means the page is published and publicly accessible; a "0" means it is not published and cannot be seen unless the user can log into the website.
It's important to note that maps will come on and off the website with time as many of them are seasonal or for certain events. Therefore, you may not always be able to see the maps on the pages linked to in the spreadsheet above or in the output of that query. If you need help getting back a map that once was on the site but now isn't, one of two things can happen:
The Digital Team can get the old JSON configuration file from a past revision of the page, and you can work off that to start
You should be able to find the JSON in this Google Drive folder which has every JSON config we've created for a map on boston.gov.
Below are the main steps in getting a map on boston.gov:
Get data
Make sure the data stays up-to-date
Create a map on boston.gov in Drupal
There are two main ways you can set up the data for maps on boston.gov:
Google Sheet that feeds into a hosted feature service
ArcGIS Online hosted feature service
For boston.gov to be able to access and render the dataset, it has to be served over HTTPS. This means anything published to ArcGIS Online via the EGIS database cannot be used for a boston.gov map since those have HTTP feature service URLs.
Feature service url for Landmarks published via EGIS database: http://gis.cityofboston.gov/arcgis/rest/services/EnvironmentEnergy/OpenData/MapServer/3
Feature service url for Landmarks hosted feature layer: https://services.arcgis.com/sFnw0xNflSi8J0uh/arcgis/rest/services/BLC_Landmarks_Hosted_Approved_Landmarks/FeatureServer
If the stakeholder you are working with is more comfortable editing and updating information in a Google Sheet, you can create a hosted feature service using this.
The Google Sheet you set up needs to have one of two things:
Latitude and Longitude fields (preferred method)
An address field that contains the location's entire address (e.g. 200 Heath St, Jamaica Plain, MA 02130)
Using latitude and longitude fields allows the connection between the Google Sheet and the created ArcGIS hosted feature service to be automated. If you use the address field, the Google Sheet and the feature service will not be connected and you'll have to manually update it every time it changes.
To create a new feature service using Google Sheets, you need to:
1. Download it as a csv file:
2. Rename the file to whatever you want your feature service to be named.
3. Log into BostonMaps under the ETL developers username.
4. Create a new folder for this feature service.
5. Click on "Add Item" the "From Computer":
6. Choose the file you want to upload. Give it clear tags and confirm that it is using the correct fields for locations. If you are using addresses instead of Latitude and Longitude fields, make sure "ArcGIS World Geocoding Service" is selected.
Below is a screen recording of this process:
7. Once the feature service is created, you need to make sure the share settings are set to "public". If you don't do this, we won't be able to access the service from boston.gov.
If the stakeholder you are working with is comfortable editing hosted feature services in ArcGIS Online, you can use that as well. The only thing needed for the boston.gov maps is that the feature layer be shared publicly.
It is not ideal to have publicly accessible hosted feature services; therefore, it is best practice to create a View of the feature service you are using. You can then set that view to be publicly accessible and use that for the map while the editable layer stays internal.
An example of this set up is with the COVID-19 testing locations:
You can create an ArcGIS web application that allows the stakeholder to edit and update the data if you want. Or, if they are comfortable just using the "Open in map viewer with full editing control", you can have them use that.
The pro of setting the map up this way is that when the stakeholder makes an update it is immediately and automatically updated on the boston.gov map.
If you are using an ArcGIS hosted feature service and having the stakeholder edit that, you just need to give them directions on how to add it to a map with full editing control; they can then update the service.
If you are using a Google Sheet that has Lat/Long fields, you need to give the stakeholder instructions on how to update that information. Generally, we point people to Google Maps. Here is a screen recording of how to get the values.
Once you have that, you can submit a ticket to the Data Engineering team via this form. Be sure to include the link to the Google Sheet, the link to the hosted feature service in your request, and the frequency with which you want the pipeline to run (most if not all are nightly).
It might take a few days for the connection to get automated, and you may have a time sensitive map you are working on (e.g. early voting locations, covid testing sites, etc.). In this case, you can follow the instructions for manually updating a map below.
To manually update a map:
Download the spreadsheet as a csv file.
Rename it to match the hosted feature service.
Navigate to the hosted feature service's "Overview" page in ArcGIS Online.
Click on "Update Data" then "Overwrite Entire Layer"
Choose the csv file you downloaded and renamed.
Date fields can get read into ArcGIS Online as what I think are unix time stamps (number of seconds since Jan 1, 1970).
A hack-y fix for this is to add an additional row to the spreadsheet and just add "test" into the date field for that row. All other fields can be blank.
This extra row will force the dates to be parsed as strings in ArcMap. After you've updated the map, you can delete the test entry so you don't confuse the stakeholder; just remember to put it in again before you update it next time. Again, this is a hack-y solution! I do not believe this issue exists when lat/longs are used and the connection is automated.
You may have to work with your stakeholders to fudge some addresses to get them to show up on the map if you are relying on the ArcGIS World Geocoder. If you open map viewer in ArcGIS Online, you can search for addresses and figure out which one will get you closest to the location you are trying to map.
For example, mapping City hall with the ArcGIS world geocoder can lead to different results depending on the address entered:
The maps component on boston.gov is made for very simple, operational maps. Best practice is to keep the layers as minimal as possible - 1-2 layers max is best.
Any map icons need to be publicly accessible somewhere on the internet for us to be able to use them on a map. In most cases, we use icons available via the Digital Team's pattern library (Fleet). You will see maps that leverage icons that are not in that list; if that happens and you want to use one of those icons, or you want any of the other experiential icons in this Google Drive location, email the Digital Team. They should be able to get it onto boston.gov and give you back a link to it.
We can cluster points on maps using Leaflet's marker clustering functionality. This is important to note, because this is one of the only ways to view points that are on top of each other. The other way is to leverage filtering, discussed below.
If you have a dataset or are mapping two datasets where points overlap, you need to move the location of one of the points if you don't want to use clustering to display them both. We do not have the functionality to click through multiple pop-ups like ArcGIS Online maps do.
For example, in the 2020 early voting map, we display both ballot dropboxes and early voting locations on the same map as two separate layers. There is a ballot dropbox at City Hall as well as an early voting location. They were originally sitting on top of each other until we updated the address for the early voting location.
Polygons and lines can only have one color. For polygons the color will be made transparent and fill the space. The outline of the shape will be the true color.
For lines, the line will be that color but as the user zooms in, it will get transparent so that street names are visible.
Polygons and lines can be set to change color when a user hovers over them:
On these maps we cannot color by attribute (e.g. locations with "Type" of "x" are blue, type "y" is red). If you encounter a situation where this is necessary, the best thing to do is create views in ArcGIS Online that filter by specific type. Then each view can be added to the map as its own layer.
The layers on these maps can be set up to be filtered by the user. Each filter will only work on one layer, but this can be used to help display points that may lie on top of each other.
For example, the food trucks map has multiple trucks in the same location various days of the week. We use filters with a default value of the current day to determine what trucks to show when the map is opened:
These filters can also be aware of each other. For example, the "Truck" filter above will only show trucks that show up on Fridays now that the Day has been set to that.
For certain maps, it is important that we use SAM addresses because precision is critical. This is generally the case when we are trying to tell people which polygon their address is inside (e.g. what City Council district you live in, whether or not your address is in a Historic District).
In these cases, the ESRI world geocoder is not precise enough as it may geocode addresses to the street centerline which could mean we tell someone they are represented by the wrong person.
These maps are configured using JSON. This folder contains all the JSONs that have been created for maps on boston.gov.
The best way to get started when creating a new map is to look through this list and choose a map that has similar functionality to what you are going for. Then grab the JSON configuration linked there, and drop it into this map editor page. You can then start making updates to the JSON, and when you want to see the change reflected on the map, press the tab key and it will update.
The embedded maps feature was written as a web component so that it could be added to the Boston.gov Drupal site but also any other web pages we make. It is built using the StencilJS reactive web component compiler.
For info about the <cob-map>
web component and its attributes, see the “Notes” pane in the <cob-map>
Fleet documentation.
The JSON configuration for the web component is specified in map-1.0.json.schema in the CityOfBoston/patterns repo. Auto-generated documentation for it can be found here: map-1.0 schema documentation.
Source code for <cob-map>
and our other web components is in the Fleet repo: CityOfBoston/patterns
Further information about the Cityscore microservices.
There are 2 views which drive the GET requests for the endpoints.
The views have a dependency on the views_php
contributed module.
These views are defined at: /admin/structure/views/view/cityscore/edit
.
Display: Cityscore.
This display provides a full Drupal page containing the current cityscore table of metrics. The "page" URL for this is /rest/cityscore/html-table
. Because this is a fully themed Drupal page, it is more usual for the endpoint /rest/cityscore/html
to be used, as that endpoint provides just the <table>
HTML wrapped in a <div>
tag.
Display: JSON Output.
This display provides a JSON array with a single object being the current Cityscore total. The current Cityscore total is the mathematical average (mean) of all current non-null metrics.
This view is available at the endpoint /cityscore/totals/latest.json
. Because this is the output from a view, the JSON string is wrapped as an array. For backwards compatibility the endpoint /rest/cityscore/json
is used.
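As a simple illustration of that calculation (the metric names and values are made up; the real computation happens inside the view):

```php
<?php
// Cityscore total = mean of all current non-null metric scores (illustrative).
$metrics = ['ontime_permits' => 1.2, 'pothole_repair' => 0.8, 'ema_response' => NULL];
$scores = array_filter($metrics, function ($value) {
  return $value !== NULL;
});
$cityscore = round(array_sum($scores) / count($scores), 2);  // 1.0
```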
The endpoints provided at rest/cityscore/load
, /rest/cityscore/html
and /rest/cityscore/json
are all tracked on Google Analytics. Filter for pageviews named "/api".
A method of creating PDFs or modifying template PDFs was required to serve customized PDFs for various applications on the website.
The solution needs to be a generic service exposing functions and an interaction workflow to enable different (and new) applications to use it (to generate or modify the PDFs as needed) without rewriting the PDF Manager module itself.
A Drupal module is provided which can manage the generation of PDFs on boston.gov. The module provides a series of methods and properties which can be used to create, manipulate and access PDFs.
A limitation of all the PHP libraries found (and in general all open-source libraries for all platforms in our tech stack - i.e. PHP, JavaScript) was that the form elements in fillable PDFs were removed during processing. This meant that the libraries returned a flat non-fillable PDF even if the original document was fillable. A CLI application, PDFToolkit (pdftk), was found which provides specific functionality to manage fillable PDFs.
The PHP library FPDF was leveraged to create and edit PDFs along with 2 extensions to allow the importing of existing PDFs and the creation of barcodes.
During phase 1, a Drupal module PDFManager was created which is capable of:
adding text of any color, size and supported font to a new or existing document,
overlaying images onto an existing document,
generating a unique barcode or barcodes and overlaying them onto an existing document,
updating the PDF's document properties (author etc.)
The module can create a new PDF or alter an existing PDF (e.g. a template). However, if a fillable form type PDF is used as a template, the form fields are stripped and the output from the module will be a flat file. This limitation is mostly removed in phase 2.
The PDF document manipulation is defined by a JSON file, and this file can be parameterized. Using the JSON file, Drupal CMS content and/or content from an external database can be injected onto the form.
This module can be used by any other module in Drupal.
The CLI package PDFToolkit (pdftk) was leveraged, and a City of Boston managed API was deployed in AWS as a microservice to create and edit fillable PDFs (fillable forms).
pdftk runs from a Linux command line. The main Drupal site (served by an Acquia webserver), while running on Linux, is not managed by the City of Boston, and the pdftk libraries are not loaded on that server. Given the short time constraints, pdftk was deployed within the same container as the DBConnector, leveraging the existing endpoint services (node/javascript/express) and some shell scripting.
The PDFManager module functionality was extended using this microservice to:
insert text into a fillable field in the form
return a fillable form to the caller (provided a fillable form was used as a template)
With Phase 2, the module can modify and return a fillable form PDF, but still cannot create a fillable form, nor can it add (or remove) fields to a PDF.
The Drupal PDFManager module is found here:
Adding the code in that folder to a Drupal site, then enabling the module in Drupal is all that is needed to install it.
The actual document manipulations (for both phases) are done by the class PDFManager.
While the PDFManager module is a Drupal module, the actual PDFManager class itself has no dependencies on other Drupal code, and hence can be used in any other PHP application.
The PDFManager class is included in any other class or PHP script by referencing the namespaced module:
Generally the workflow is to create a new instance of the object, then to pass in static data regarding filenames and data to be applied to the document, and finally to generate the document.
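A rough sketch of that workflow (the namespace and method names here are illustrative, not the module's actual API; see the bos_assessing examples referenced below for real usage):

```php
<?php
// Illustrative only - the real PDFManager namespace and method names may differ.
use Drupal\bos_pdfmanager\PdfManager;            // hypothetical namespace

$pdf = new PdfManager();                         // 1. create a new instance
$pdf->setTemplate('templates/abatement.pdf');    // 2. pass in static data: filenames,
$pdf->setOutput('output/abatement-1234.pdf');    //    plus the text/barcodes to apply
$pdf->setData(['parcel_id' => '1234567890']);
$pdf->generate();                                // 3. generate the document
```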
This example shows a simple use of the PDFManager to complete a flat PDF.
This example shows a simple use of the PDFManager to complete a fillable form PDF.
There are more complex examples in bos_assessing - pdf.php and pdf2.php; these also show how a JSON file can be used for managing the text and barcode insertions, and how it can be parameterized so that data can be injected from a database.
Additional functionality can be added to the PDFManager in the future, for example to extend it to be able to create fillable PDFs from scratch, and/or to add fillable fields to an existing document.
To extend, simply modify the code in either the Fpdf.php or PdfToolkit.php classes as needed, and possibly the PdfManager.php class as well.
If a new PHP library or remote endpoint is utilized, it is recommended that a new class be created, as in the example below. This class would then need to be added to PdfManager.php
and code added and/or new methods exposed to utilize it.
All public methods from this class should have error trapping so that errors do not bubble up to PdfManager. When an error occurs, capture the error into the class error variable and then return FALSE. Code in the PdfManager should then check for an error and act accordingly.
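A sketch of that error-trapping convention (the class and method names are illustrative, not the existing helper classes):

```php
<?php
// Illustrative wrapper class for a new PHP PDF library or remote endpoint.
class MyPdfLibraryWrapper {

  /** @var string Holds the most recent error raised inside this class. */
  public $error = "";

  /**
   * Public methods trap their own errors and return FALSE on failure so that
   * nothing bubbles up to PdfManager.
   */
  public function addWatermark(string $file, string $text) {
    try {
      // ... call the underlying PHP library or remote endpoint here ...
      return TRUE;
    }
    catch (\Exception $e) {
      // Capture the error into the class error variable, then return FALSE.
      $this->error = $e->getMessage();
      return FALSE;
    }
  }

}
```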
City of Boston use docker containers for local development.
Lando is used by the Drupal development team to manage the docker containers and provide basic tooling for the local development environment.
City of Boston use PostMark to relay emails.
Code for this service/endpoint is contained in the module bos_email
.
POST
/rest/email_token/create
Provides an email session token which must be supplied as a field when the form is submitted to the rest/email_session endpoints.
The session token is saved against the users session on the server.
POST
/rest/token/remove
Invalidates a previously created email session token.
POST
/rest/email_session/{server}
POST
/rest/email/{server}
Use email_session for additional security.
POST
/emails
When and if a new environment is set up on Acquia for CoB, the following steps should be followed:
When a new environment is added, it will have a 3-4 character name (e.g. uat
or dev2
etc). This checklist refers to this environment short-name as the envname.
This change adds the specified domains to the acquia-purge registry. This means the varnish cache for these domains will be automatically purged. If a sub-domain is attached to an environment and is NOT listed here, then it will not be automatically purged as content is changed.
This change directs the new environment to request images and files from a shared (linked) folder rather than the default sites/default/files
folder. The folder is linked to conserve file space as each environment basically requires the same sets of images and files.
The following steps need to be completed to allow single sign on via Ping Federated.
Attach a GitHub branch to an Acquia environment.
Acquia provide 6 environments to CityOfBoston.
This process has been decommissioned and some of the processes below are no longer implemented in scripts.
This page is left here only to provide background should COB decide/require to have Drupal in an AWS managed container.
You can push your local repository up to a test instance on our staging cluster on AWS. This will let you show off functionality using data from a staging snapshot of Boston.gov.
You will need a full development environment and Drupal 8 installed on your local machine (refer to earlier notes).
Get a “CLI” IAM user with an access key and secret key.
Use aws configure
to log your CLI user in locally. Use us-east-1
as the
default region.
Request your CLI IAM user credentials from DoIT.
To push your local repository up to the cluster, run:
Where <variant>
is the variant name you created in CityOfBoston/digital-terraform
.
This will build a container image locally and upload it to ECR. It will then update your staging ECS service to use the new code.
By default, the container startup process will initialize its MySQL database with a snapshot of the staging environment from Acquia.
After the container starts up and is healthy, the doit
script will print useful URLs and then quit.
Direct SSH access is not generally available on the ECS cluster. To run drush
commands on your test instance, you can visit the webconsole.php
page at its domain. This will give you a shell prompt where you can run e.g. drush uli
to get a login link.
The webconsole.php
shell starts in docroot
.
Talk to another DoIT developer to get the webconsole username and password.
NOTE: Each time you deploy code to your test instance it starts with a fresh copy of the Drupal database.
If you want to preserve state between test runs, log in to webconsole.php
and run:
(The ..
is because webconsole.php
starts in the docroot
.)
This will take a snapshot of your database and upload it to S3. The next time your test instance starts up, it will start its sync from this database rather than the Acquia staging one.
The database will also be destroyed when the AWS containers are restarted for any reason. It is good practice to stash your DB regularly.
To clear the stash, so that your database starts fresh on the next test instance push, use webconsole.php
to run:
Here is a snapshot of the doit script referred to above.
Elsewhere this might be termed spinning up an on-demand instance of the site.
Make sure you have the latest copy of the main Drupal 8 repository cloned to a folder <repo-root-path>.
Checkout the branch develop
and make sure the latest commits are pulled (fetch+merged) locally.
Commit your work to a new branch (on-demand-branchname
) off the develop
branch.
Push that branch to GitHub, but do not create a PR or merge into develop
.
Edit <rep-root-path>/.travis.yml
file and make the following additions:
(Note: replace <on-demand-branchname>
with on-demand-branchname
.)
Edit <rep-root-path>/scripts/.config.yml
file and make the following additions:
(Note: This partial example addition is configured to deploy to the Ci environment on Acquia)
(Note: replace <on-demand-branchname>
with on-demand-branchname
.)
Commit the .config.yml and .travis.yml
changes to on-demand-branchname
and push to GitHub - but do not merge into develop
.
Make a small inconsequential change to the code and commit to the on-demand-branchname
branch, and push to GitHub. This will cause the first-time build on Travis, and deploy into the on-demand-branchname-deploy
branch in the Acquia Repository.
The "on-demand" environment is now set. Users may view and interact with the environment as required. See Notes in "gotcha's" box below.
Once you have finished the demo/test/showcase cycle, you can merge the on-demand-branchname
branch to develop
- provided you wish the code changes to be pushed through the continuous-deploy process to production.
Finally you can detach the on-demand-branchname
branch from the Acquia environment, and set it back to the tags/welcome
tag.
You can direct users to the URLs below; select the environment you switched to the on-demand-branchname-deploy
branch (in step 8) from the table below.
Housekeeping.
When finished with the environment, you should consider rolling back the changes you made to .travis.yml
and .config.yml
in steps 4 & 5 before finally merging on-demand-branchname
to develop.
It is likely that the on-demand instance is no longer required, and it's unnecessary for the on-demand-branchname
to be tracked by Travis.
Also as a courtesy, change the branch on the environment back to tags/WELCOME
so it is clear that the environment is available for use by other developers.
Deploying: If you switch the code on the Acquia server from on-demand-branchname-deploy
to some other branch or tag, and then back again - then in Acquia's terminology each switch of branch is a "deploy" of the code. GitHub is not affected by this change, so nothing will run on Travis, but once each switch is complete, Acquia's post-code-deploy
hook script will run.
- That deploy-hook script will sync the database from the stage
environment and will overwrite any content in the database. Therefore, any content previously added/changed by users will be lost.
In the CoB Drupal 8 repository, edit the Acquia Hook scripts in the /hooks/common
folders, looking for the following code in the post-code-update.sh
and post-code-deploy.sh
files:
In the CoB Drupal 8 repository, find the following line in the script at /hooks/common/cob_utilities.sh
file (approx line 270):
In the CoB Drupal 8 repository, find the following line in the script at /sites/default/settings/hooks/common/cob_utilities.sh
file (approx line 63):
In the CoB Drupal 8 repository, edit the drush environment definition script in the /drush/sites/bostond8.site.yml
file. Add a new entry at the bottom of the file:
In the .htaccess
file in the CoB Drupal 8 repository, alter the domain filters as follows:
Everywhere you see this pattern in the file:
Log in to the Acquia Cloud console and click on the bostond8 application.
To use the environment as a Drupal site, you need to attach a branch from the Acquia git repository. For detailed instructions see section.
On demand instances of the Drupal site (boston.gov) are useful to demonstrate new features or functionality sand-boxed away from the
These on demand versions of boston.gov are designed to be housed on a near-duplicate environment to the production site, and to be viewable in a normal browser from anywhere by people with the correct link.
The dev, stage(test) and prod
environments are associated with git branches used in the and can not be attached to different branches or repository tags without disrupting and potentially breaking the workflow.
The dev2, dev3, ci and uat
environments can track any desired branch or tag (even develop-deploy
or master-deploy
) without disrupting the .
Install the .
To create a place to upload your code, follow the instructions in the repository to make a “variant” of the Boston.gov staging deployment.
The Travis build can be .
Login to the Acquia Cloud console. In the UI switch the code in the Ci/Uat environment to the on-demand-branchname-deploy
branch.
This will cause a , which will copy across the current stage
database and update with configuration from the on-demand-branchname
branch.
Updating: If you push changes to on-demand-branchname
in GitHub (which eventually causes Acquia's on-demand-branchname-deploy
to be updated) - then in Acquia's terminology you are "updating" the code.
Any commits you push to the GitHub on-demand-branchname
branch will cause and update the code on the and this will cause Acquia's post-code-update
hook script to run.
- That update-hook script will back up your database and update any new configurations, but will not update or overwrite any content (so changes made by users will be retained).
Name | Type | Description |
---|---|---|
authorization* | String | token {token} |
email["token_session"]* | String | The session token. |
email["to_address"]* | String | The recipient. |
email["from_address"]* | String | The sender. |
email["subject"]* | String | Subject for email. |
email["message"] | String | Body for message (contactform) |
email["useHtml"] | Int | Should mail be HTML format? 1 or 0 |
email["template_id"] | String | Use a POSTMARK template |
email["cc"] | String | CC recipients for email. |
email["bcc"] | String | BCC recipients for email. |
email["{string}"] | String | Any other fields required by templates. |
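As a rough sketch of how a POST using the fields above might be composed from PHP (the endpoint URL, server name and values are examples only; check the bos_email module for exactly how the authorization token and fields are expected):

```php
<?php
// Illustrative only - not the definitive bos_email API usage.
$session_token = 'TOKEN_FROM_/rest/email_token/create';
$response = \Drupal::httpClient()->request('POST', 'https://www.boston.gov/rest/email/contactform', [
  'form_params' => [
    'authorization'        => 'token {token}',      // literal placeholder per the table above
    'email[token_session]' => $session_token,
    'email[to_address]'    => 'someone@boston.gov',
    'email[from_address]'  => 'noreply@boston.gov',
    'email[subject]'       => 'Contact form submission',
    'email[message]'       => 'Hello from the contact form.',
  ],
]);
```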
Environment | URL |
---|---|
uat | (public DNS entry) |
ci | (public DNS entry) |
dev2 | (no DNS - make entry in local hosts file) |
dev3 (pending) | https://d8-dev3.boston.gov (no DNS - make entry in local hosts file) |
General familiarity with the Drupal platform is a baseline requirement for anyone working on the site. We suggest reading through the following user guide to Drupal 8:
Additionally, Acquia's free training program Acquia Academy offers a series of Youtube video tutorials which can be found here, including a Drupal 8 Beginner's Course:
We use 2 custom themes, one which presents the backend and one which presents the front-end.
You will use Git as the version control system for the City of Boston website and to manage code in the Acquia Cloud environment. If you are not already familiar with Git, you will want to check out this in-depth Git series. The first seven videos will give you most of what you need to know:
Introduction to the Git Series
What is Version Control?
Installing and Configuring Git (Dev Desktop comes with Git)
Getting Help with Git
Git Crash Course
Working with Git Branches and Tags
Moving through Git History
The next fourteen videos are more advanced Git topics, so you might want to save those for a later time.
You will see the listing of videos in the series in the upper right panel; there are also a couple of different levels of scrollbar, so it’s easy to miss the later videos.
Caching considerations for Drupal with Acquia
Memcache is not used on boston.gov (at this time).
You can inspect the headers of requests to a webserver to see if varnish is enabled, and if content was served from the Varnish and/or Drupal caches.
This terminal command will return the headers from a request to a URL:
Examples:
Is "passive" caching: Varnish is not aware of the origin of html content it serves/caches.
Is outside of the Acquia load-balancers and is the first cache a user request hits.
Does not cache content for authenticated users.
On boston.gov, the Acquia Purge module is configured to remove entities (pages) from the Varnish cache as they are updated by content editors in Drupal. This invalidation process uses queues in Drupal. The Drupal queue processor is triggered by cron and runs until the queue is exhausted.
On production
cron runs every 5 minutes,
so (if there is no active queue) it could take up to 5 minutes for content changes to appear.
On stage and develop
cron runs every 15 minutes.
Acquia provide the memcached
libraries on its environments, and will configure special memory allocations for memcache on request.
Memcache modules are not enabled on the City of Boston Drupal 8 environments.
Images:
Static Content: (typically web-pages built from a Drupal content type)
When an entity (bit of content) is updated in Drupal its tags are invalidated. Pages which use that content (and which are already cached by Drupal) are also invalidated. The next time that page is requested, a rebuild/regeneration and re-cache occurs within Drupal.
When a page is invalidated in Drupal, Varnish is notified and the page is also invalidated in the Varnish cache.
Because Drupal caching and invalidation is now so effective, the page-expiry for nodes should be set to a large value (> 1 month). This is done in the APE configuration.
Dynamic Content: (typically REST end-points and web-pages built from, or containing Drupal views)
Views can be given a lifetime, and set to expire a certain time after the last time the view's underlying query was run. As I understand it, with time-based caching there is no invalidation of the node, but as the content expires it will be re-cached by Drupal using the traditional (Drupal 7) method. The page containing the view should be set to expire after a relatively short period (in APE) - around the same time value as the view cache expiry. Unless told otherwise, Varnish expires the page after 2 minutes.
REST endpoints should be given an expiry in APE.
The Varnish cache performs 2 functions, one intended and one somewhat unintended.
Reduces load on the application server (i.e. webserver), but also
The cache will continue to serve cached pages even if the application server (webserver) is down or otherwise unavailable. Any cached pages in varnish will continue to be served until the pages expire in the cache. Note: Not all pages are cached, and authenticated sessions are not cached.
The configuration system for Drupal 8 and 9 handles configuration in a unified manner.
By default, Drupal stores configuration data in its (MySQL) database, but configuration data can be exported to YAML files. This enables a site's configuration to be copied from one installation to another (e.g. dev to test) and also allows the configuration to be managed by version control.
TIP: Configuration data (aka settings) includes information on how custom and contributed modules are configured. Think of configuration as the way developers define how the Drupal back-end functions, and what options will be available to content authors.
Configuration is very different to content. Content is information which will be displayed to website viewers in Drupal nodes. Content is also stored in the database, but is not managed by the configuration system.
Drupal has a built in configuration management system, along with drush CLI commands to import and export configurations.
Configurations are saved in a folder (the config sync directory) on the webserver hosting the Drupal website. This folder is defined in the settings array $settings['config_sync_directory']
which is defined in the settings.php
file. This folder is defined relative to the docroot
folder, typically outside of the docroot, for example:
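A minimal sketch (the path shown follows the docroot/../config/default convention referred to elsewhere in these notes):

```php
// settings.php - the config sync directory sits outside of the docroot.
$settings['config_sync_directory'] = '../config/default';
```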
drush cex
exports configurations from the database into the config sync directory.
drush cim
imports configurations from the config sync directory into the database.
Module Exclusions: The configurations for an entire module can be excluded from both of the drush cim / cex
processes by defining them in the $settings['config_exclude_modules']
array in the settings.php
file. For example:
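For instance (the module names here are purely illustrative):

```php
// settings.php - configs for these modules are excluded from drush cim / cex.
$settings['config_exclude_modules'] = ['devel', 'stage_file_proxy'];
```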
WARNING / CARE: If you add modules into this list, then they will be removed from the core.extension.yml
file during the next config export. This means these modules will be uninstalled/disabled on any environment in which these configs are imported.
As a rule of thumb - only add modules to this array that you wish to be removed for all environments other than the one you are developing on.
The Drush CLI is the main CLI utility and is installed and enabled on the CoB Drupal backend.
config:delete (cdel)
Delete a configuration key, or a whole object.
config:devel-export (cde, cd-em)
Write back configuration to module's config directory.
config:devel-import (cdi, cd-im)
Import configuration from module's config directory to active storage.
config:devel-import-one (cdi1, cd-i1)
Import a single config item into active storage.
config:diff (cfd)
Displays a diff of a config item.
config:different-report (crd)
Displays differing config items.
config:edit (cedit)
Open a config file in a text editor. Edits are imported after closing editor.
config:export (cex)
Export Drupal configuration to a directory.
config:get (cget)
Display a config value, or a whole configuration object.
config:import (cim)
Import config from a config directory.
config:import-missing (cfi)
Imports missing config item.
config:inactive-report (cri)
Displays optional config items.
config:list-types (clt)
Lists config types.
config:missing-report (crm)
Displays missing config items.
config:pull (cpull)
Export and transfer config from one environment to another.
config:revert (cfr)
Reverts a config item.
config:revert-multiple (cfrm)
Reverts multiple config items to extension provided version.
config:set (cset)
Set config value directly. Does not perform a config import.
config:status (cst)
Display status of configuration (differences between the filesystem configuration and database configuration).
Drupal Console is an alternative CLI and is installed and enabled on the CoB Drupal backend.
config:delete (cd)
Delete configuration
config:diff (cdi)
Output configuration items that are different in active configuration compared with a directory.
config:edit (ced,cdit)
Change a configuration object with a text editor.
config:export (ce)
Export current application configuration.
config:export:content:type (cect)
Export a specific content type and their fields.
config:export:entity (cee)
Export a specific config entity and their fields.
config:export:single (ces)
Export a single configuration or a list of configurations as yml file(s).
config:export:view (cev)
Export a view in YAML format inside a provided module to reuse in another website.
config:import (ci)
Import configuration to current application.
config:import:single (cis)
Import a single configuration or a list of configurations.
config:override (co)
Override config value in active configuration.
config:validate (cv)
Validate a drupal config against its schema
These are unique to the Drupal Console CLI, rarely needed but can be useful for manually creating configs for custom modules.
generate:entity:config (gec)
Generate a new config entity
generate:form:config (gfc)
Generate a new "ConfigFormBase"
generate:theme:setting (gts)
Generate a setting configuration theme
It is possible to override configurations in the php files on the Drupal back end.
Normally the configurations a developer will wish to override will be in a xxx.settings.yml file. This is where settings type configurations are defined and saved by contributed and custom modules.
The strategy to globally override a config setting for the entire Drupal site is to alter the $config
array in the settings.php
file.
Because the main settings.php
file can include different settings files for different environments, we can add global overrides to an environment-specific settings.php file to implement an override for only that environment.
TIP: Code in a settings.php file can be conditional, so the override can be made to be conditional on the value of a local (or environment) variable.
Example 1- Core config override: The system.maintenance.yml
file contains a message
key to control text that appears on the site maintenance page when shown. To override the message
key set in the system.maintenance.yml
file, place this in an appropriate settings file.
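For example (the message text is illustrative):

```php
// In settings.php, or an environment-specific settings file.
$config['system.maintenance']['message'] = 'Boston.gov is briefly down for maintenance - please check back shortly.';
```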
Example 2- Custom/Contributed Module config override: The salesforce.settings.yml
file supplied by the salesforce
module contains a key used to authenticate against a salesforce.com account in order to sync data. To override the consumer_secret
key set in the salesforce.settings.yml
file, place this in an appropriate settings file.
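For example (the secret shown is a placeholder):

```php
// In settings.php, or an environment-specific settings file.
$config['salesforce.settings']['consumer_secret'] = 'REPLACE_WITH_REAL_SECRET';
```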
Override/Secrets Best Practice:
It is best practice not to save passwords and other secrets (incl API keys) in configuration files, as these will end up in repositories, and could be made public by accident.
Instead, passwords and other secrets should be stored as Environment variables on the Drupal web server, and then be set in an appropriate settings.php
file.
Example: recaptcha secret key saved as environment variable bos_captcha_secret
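A sketch of that pattern (assuming the contributed recaptcha module's secret_key setting):

```php
// settings.php - read the secret from an environment variable, not from a config file.
$config['recaptcha.settings']['secret_key'] = getenv('bos_captcha_secret');
```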
This means that passwords and other secrets are saved on the environment to which they apply so there is less (or no) need for environment-specific overrides.
It also means that all secrets are managed the same way, and can be changed on the environment and take effect immediately without needing to redeploy any code.
PHP commands to retrieve current configuration settings are as follows:
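For example (using the maintenance message from Example 1 above):

```php
// Returns the runtime value, including any overrides from settings files.
$message = \Drupal::config('system.maintenance')->get('message');
```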
These commands will get the original config value, ignoring any overrides:
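For example (again using the maintenance message key):

```php
// Returns the stored value, ignoring overrides (the second argument disables overrides).
$original = \Drupal::config('system.maintenance')->getOriginal('message', FALSE);
```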
To assist with configuration management, there are a number of contributed modules.
The contributed modules are generally deployed to help manage situations where different configurations are desired on different environments.
Although this is not a contributed module, the use of .gitignore
provides a way to prevent configurations from making their way into repositories, and replicating upwards from the local development environments to the Acquia dev/stage/prod environments.
Simply add specific config files (and/or wildcards) to the .gitignore
file in the root of the repository.
Provided the files do not already exist in the repository, they will be ignored by git during commits and pushes from the local repository.
Example: .gitignore in repository/project root.
TIP: If you don't prefix the entry with any folder paths, then all occurrences of the file will be ignored. This includes files from config exports (drush cex
) and also from config_devel exports (drush cde
- see below.)
This module provides configuration import protection. If you are concerned that importing certain configurations when using drush cim
(which is used during a deploy) will overwrite existing configurations on a site, then config ignore will help prevent this.
Specific files to be ignored during an import can be added to the ignored_config_entities
key of the config_ignore.settings.yml
file. This array can also be overridden/extended by altering the $config['config_ignore.settings']['ignored_config_entities']
array in an appropriate settings file.
The .yml
extension is dropped and wildcards can be used to select entire modules, entities, etc:
ignored_config_entities:
- salesforce.settings
- ...
- 'core.entity_view_display.node.metrolist_development.*'
Note: This module only provides protection when drush cim
is executed. When drush cex
is executed, the config_ignore settings are not considered and a full set of configs is still exported.
If you can't use $settings['config_exclude_modules']
(because, for example, you only want to exclude the module.settings.yml
file from a module) then use gitignore to stop it being committed to the repo and deployed.
CoB Local Development.
CoB use config_ignore
as a fail-safe protection.
Configurations that are set in the production system at runtime (usually settings) via the UI and are therefore different to the config in the ../config/default
folder are added to config_ignore so that they cannot be imported over the site settings should the files exist in the folder.
This module provides configuration separation. Configurations can be split into different folders and imported/exported independently.
Drush Command Summary:
config-split:activate
Activate a config split.
config-split:deactivate
Deactivate a config split.
config-split:export
Export only split configuration to a directory.
config-split:import
Import only config from a split.
config-split:status-override (csso)
Override the status of a split via state.
Config split can be used to create a number of different configuration sets which can be applied on different environments and/or at different times. This is an ideal way to control which modules are installed on which environments, and even to provide environment-centric settings (for settings controlled via config).
This module provides custom module configuration installation. If you anticipate your custom module will be used as a "contributed" module on another site - or will be enabled or disabled individually - then you will want to save its configuration into an install
folder inside the custom module.
Drush Command Summary:
config:devel-export (cde, cd-em)
Write back configuration to module's config directory.
config:devel-import (cdi, cd-im)
Import configuration from module's config directory to active storage.
config:devel-import-one (cdi1, cd-i1)
Import a single config item into active storage.
Find the full Git series at:
City of Boston use Acquia to host all non local (docker) servers on our .
Acquia's servers are contained within an subscription and implement a cache outside the load-balancer, as .
The release of Drupal 8 contains a using "". Drupal 7's cache expired items based on a lifetime for that item. Drupal 8 introduces another option called cache invalidation. This is where you set the cache lifetime to be permanent and invalidate (purge) that cached item when it's no longer relevant. Drupal 8 does this by storing metadata about the cached item. Then, when an event occurs, such as an update on a node, the metadata can be searched to find all cache items that contain computed data about the updated node, and can be invalidated.
(for the purposes of this summary document) can be considered to be a low-level cache which optimizes caching by saving more dynamic process responses to memory. The principal value is to minimize requests between the Drupal kernel and MySQL for queries that are run multiple times during bootstrap and page requests.
Is fully independent from the Drupal kernel, and therefore is decoupled from Drupal -except for a purge module provided by Acquia which manipulates a Varnish API. - (Beware: notes are for Drupal 7) -
says in Acquia Cloud, pages are cached for 2 minutes by default.
Varnish will accept caching instructions from a web page's headers, so we use (APE) in Drupal to send specific cache instructions to Varnish. The default caching time (set by APE) for CoB Drupal pages is 4 weeks (i.e. overrides the default 2 minutes with 4 weeks!).
Drupal entities are cached using .
Drupal caching is managed by the Drupal kernel and the module (APE).
Views by default honor the tag generation and invalidation process whereby a view is cached with a tag, but the view invalidation model is not very refined (to refine the invalidation of view tags consider module - but (as of version 8.x.1.1) custom coding is required to implement). If a view is based upon the entity type node, then any change that invalidates a node tag will also invalidate the view. Although this causes (potentially) unnecessary invalidation of views, it is an effective way to ensure current content is returned from a view. If the view display is a page, then the invalidation of the view does bubble up to Varnish (provided it is using a tag-based cache strategy).
See this .
This information is adapted from this , and contains more advanced techniques and discussion.
CoB custom modules - usually taxonomy, nodes and paragraphs.
The following development conventions are being followed in developing boston.gov modules.
City of Boston have the following naming and grouping conventions for custom modules:
Templates for the component should be saved in:
To add a customized template, select a suggestion for the base (node, field, region etc), then
Save the template in the folder above
In the module_name_theme()
hook in module_name.module
add the following:
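A sketch of what that hook entry might look like (the module, template and component names are illustrative, not the actual boston.gov code):

```php
/**
 * Implements hook_theme().
 */
function module_name_theme($existing, $type, $theme, $path) {
  return [
    // Registers a customized template saved in the module's templates folder.
    'node__example_component' => [
      'base hook' => 'node',
      'template' => 'node--example-component',
      'path' => $path . '/templates',
    ],
  ];
}
```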
If a new suggestion is needed, then add the following:
Where XXX is the appropriate entity type (node, field, region, etc etc) to add a suggestion to.
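A sketch using the node entity type in place of XXX (the bundle and suggestion names are illustrative):

```php
/**
 * Implements hook_theme_suggestions_HOOK_alter() for node templates.
 */
function module_name_theme_suggestions_node_alter(array &$suggestions, array $variables) {
  $node = $variables['elements']['#node'] ?? NULL;
  if ($node && $node->bundle() === 'example_component') {
    // Offer the customized template registered in hook_theme() above.
    $suggestions[] = 'node__example_component';
  }
}
```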
Wherever possible, the style provided by the patterns library should be used. In practice this means that boston.gov can be styled by a Drupal developer ensuring that the twig template files provide HTML structured with the classes that the patterns library expects.
Should the need arise, then the patterns library style sheets can be overridden. Typically this is done at the module level, although if multiple modules will use the override, consider placing it in the bos_theme
theme.
To add overrides:
1. Create the style sheet module_name.css and appropriate markup in the relevant template (see above section).
2. Save the stylesheet in:
3. Update (or create) the module_name.libraries.yml file with the following:
4. Using a module_name_preprocess_HOOK() hook in module_name.module, attach the CSS where, and only when, it is required. For example (see the sketch below):
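A minimal sketch of the preprocess hook, assuming a hypothetical bundle (example_component) and library name (module_name.styles) declared in the libraries.yml file above:

```php
<?php

/**
 * Implements hook_preprocess_HOOK() for node templates.
 *
 * Attaches the module's CSS library only on the relevant content type.
 */
function module_name_preprocess_node(&$variables) {
  if ($variables['node']->bundle() === 'example_component') {
    $variables['#attached']['library'][] = 'module_name/module_name.styles';
  }
}
```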
Wherever possible, JavaScript should not be used on boston.gov. This is to maintain compatibility with as many browsers as possible, and to maximize accessibility for screen readers etc.
Should the need arise, a JavaScript library can be created and deployed. Typically this is done at the module level, although if multiple modules will use the override, consider placing it in the bos_theme theme.
To add overrides:
1. Create the JavaScript library module_name.js.
2. Save the library in:
3. Update (or create) the bos_modulename.libraries.yml with the JavaScript directive.
4. Using a bos_modulename_preprocess_HOOK() hook in bos_modulename.module, attach the JavaScript library where, and only when, it is required. For example (see the sketch below):
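A minimal sketch of attaching the library, assuming a hypothetical paragraph bundle (example_widget), library name (bos_modulename.scripts) and drupalSettings key; none of these names come from the boston.gov codebase:

```php
<?php

/**
 * Implements hook_preprocess_HOOK() for paragraph templates.
 *
 * Attaches the JavaScript library, and passes a setting to it, only where needed.
 */
function bos_modulename_preprocess_paragraph(&$variables) {
  if ($variables['paragraph']->bundle() === 'example_widget') {
    $variables['#attached']['library'][] = 'bos_modulename/bos_modulename.scripts';
    // Hypothetical setting made available to the script as drupalSettings.
    $variables['#attached']['drupalSettings']['bos_modulename']['endpoint'] = '/rest/example';
  }
}
```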
Drupal 8 defines settings and configurations in YML files with the actual "current" settings and configurations being stored in the Drupal (MySQL) database.
When the website is deployed or the web server is restarted, configurations are re-read from the database. To reload the configuration and settings from yml files requires a manual (usually drush) process to be run by a developer.
Clearing the sites caches causes cached configurations and settings to be replaced with values from the database. Clearing caches does not reload yml files.
YML files in a module's docroot/modules/custom/ ... /module_name/config/install folder will be imported into the database when the module is first installed.
YML files in the docroot/../config/default/ folder will be imported into the database when the configuration is imported via the Drupal UI or the drush command.
Current (run-time) settings and configurations in the database can be exported to the docroot/../config/default/ folder via the Drupal UI or the drush command.
If the config_devel module is enabled then a module's configuration can be exported to the module's config/install folder.
The dependent configurations are defined in the module_name.info.yml file as follows:
To export these configurations to the config/install folder, use the following drush command:
Modules should try to reuse field.storage.entity-type.field_name configurations wherever possible.
field.storage.entity-type.field_name configurations should be:
1. saved in the module's parent module (e.g. bos_components or bos_content) to enable sharing, and
2. added to the parent's config_devel section of the .info.yml file.
Custom theme which presents the back-end UI to content authors and editors.
Boston.gov use Drupal core workflow and moderation modules.
CoB use the following modules for moderation:
Content Moderation: [core] Provides moderation states for content.
Workflows: [core] Provides UI and API for managing workflows. This module can be used with the Content moderation module to add highly customizable workflows to content.
Moderation Note: [contrib] Provides the ability to notate elements of a moderated Entity.
Moderation Sidebar: [contrib] Provides a frontend sidebar for Content Moderation.
Modules can contain a single vocabulary taxonomy.
A good example module can be found at:
Module naming convention is to call the module vocab_moduleName. The "moduleName" should be indicative of the taxonomy contained within the module.
Sub-pages in this section assume an example module named vocab_module_name, with the module folder at:
Notes on bos_admin theme for UX when adding content via admin pages.
To keep a clear and clean editor experience which is uniform across the site, the form display configuration for nodes will contain groups.
There will be a root (parent) group of type tabs. This group will contain child groups of type tab. Each tab group will contain the node's fields.
Recommended Grouping Layout:
1. Required: Create a parent tabs group called group_main (the name is not important).
2. Create child tab groups with the following layout:
- Basic Information: contains custom fields required by the new content type,
- Sidebar Components: a single Entity reference revisions field for sidebar paragraphs,
- Components: a single Entity reference revisions field for main page paragraphs,
- ... then other tab groups as needed (try to minimise if possible).
The use of further nested groups is discouraged, except for grouping which occurs within paragraph components that are exposed in Components or Sidebar Components tabs.
If other groups are required to help clarify the form display, they should be details type groups, and should be set to be collapsible and collapsed by default.
IMPORTANT:
For site consistency, ensure any and all Entity reference revisions fields (i.e. paragraphs) on the node are set to "Paragraphs (EXPERIMENTAL)" in the form display.
The bos_admin theme makes some changes to the node administration forms.
Config settings provided by Drupal core and Drupal contributed modules are moved into a tab called Advanced, and are set as children of the tabs group defined above.
This manipulation is done in the hook bos_admin_form_alter() found in the bos_admin.theme file at themes/custom/bos_admin.
The moderation state, revision log note and save / preview / delete buttons are grouped together in a details group and moved to the right sidebar area of the administration form.
This manipulation is done in the hook bos_admin_form_alter() found in the bos_admin.theme file at themes/custom/bos_admin.
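The sketch below illustrates the kind of manipulation described above; it is not the actual bos_admin implementation, and the group and element names (group_advanced, the URL alias element) are illustrative only.

```php
<?php

/**
 * Implements hook_form_alter() from the bos_admin theme (bos_admin.theme).
 *
 * Adds an "Advanced" tab and re-parents a core-provided element into it.
 */
function bos_admin_form_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  // Only act on node add/edit forms.
  if (preg_match('/^node_.*_form$/', $form_id)) {
    // Hypothetical "Advanced" group.
    $form['group_advanced'] = [
      '#type' => 'details',
      '#title' => t('Advanced'),
      '#weight' => 99,
    ];
    // Re-parent a core element (e.g. URL alias settings) into the new group.
    if (isset($form['path'])) {
      $form['path']['#group'] = 'group_advanced';
    }
  }
}
```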
Developer notes for content type (node) design and implementation.
Modules can define multiple content types (nodes) grouped by similar function.
A good example module can be found at:
Module naming convention is to call the module module_name. The "module_name" should be indicative of the node(s) contained within the module.
Sub-pages in this section assume an example module named module_name, and therefore the module folder would be:
Drupal 9 (our current install) uses CKEditor 4; when we move to Drupal 10, it will use CKEditor 5. CKEditor 5 is currently installed in core/modules, but not used.
We currently use two or more versions of CKEditor, plus extensions of the plugin in two other components.
Our current Drupal version (D9) uses CKEditor 4 in the modules/contrib folder.
Once we upgrade to Drupal 10, we will need to move from CKEditor 4 to 5, because Drupal 10 does not support CKEditor 4.
Samples of CKEditor 5 we can explore to integrate/use can be found here:
Building Housing allows constituents to see a full inventory of projects and parcels managed by the city of Boston.
This Drupal App allows residents to browse all active housing, open space, commercial, and to be decided (TBD) projects. It also provides information on city-owned land for sale.
Residents can search for a specific project and/or view a map of all projects. Drilling down into a project page displays goals, a timeline, photos, meetings and more.
Search for Projects on the Building Housing Map
View details of a Project
Auto-create Community Meeting Events from Sales Force on CRON (5 minutes)
Auto-create and update Projects from Sales Force on CRON (5 minutes)
Drupal
Sales Force
Google Maps API
node_buildinghousing (docroot/modules/custom/bos_content/modules/node_buildinghousing)
Views
Webforms
Geolocation
Salesforce
From the building housing landing page a user can click and open a map showing all the current projects. The map and sidebar list are both generated via Building Housing View.
Entry point: /buildinghousing (click show map)
Custom CSS:
docroot/modules/custom/bos_content/modules/node_buildinghousing/css/node_bh_landing_page.css
docroot/modules/custom/bos_content/modules/node_buildinghousing/css/views_bh_listings.css
Views Template: docroot/modules/custom/bos_content/modules/node_buildinghousing/templates/views-view--bhmaps--maplist.html.twig
Views Functions: docroot/modules/custom/bos_content/modules/node_buildinghousing/node_buildinghousing.views.inc
Custom Markers: docroot/modules/custom/bos_content/modules/node_buildinghousing/images
Building Housing allows constituents to see a full inventory of projects and parcels managed by the city of Boston.
Entry point: /buildinghousing (click show map)
URL pattern: /buildinghousing/{ProjectName}
Custom CSS: docroot/modules/custom/bos_content/modules/node_buildinghousing/css/node_bh_project.css
Field Templates: docroot/modules/custom/bos_content/modules/node_buildinghousing/templates/snippets
Field Formatters: docroot/modules/custom/bos_content/modules/node_buildinghousing/src/Plugin/Field/FieldFormatter
Helper Functions (Pre-process, alters, fields):
docroot/modules/custom/bos_content/modules/node_buildinghousing/node_buildinghousing.module
docroot/modules/custom/bos_content/modules/node_buildinghousing/src/BuildingHousingUtils.php
Parcel Map - settings
Uses BuildingHousingUtils->setParcelGeoPolyData() and ArcGIS to set the polygon geo data for the parcels.
Photo gallery - settings
Building Housing Project Map (same as above Map feature)
Project information
Developer Information - Custom field in node_buildinghousing.module
Project Goals
Project Timeline - Altered field to combine other fields
docroot/modules/custom/bos_content/modules/node_buildinghousing/src/Plugin/Field/FieldFormatter/EntityReferenceTaxonomyTermBSPublicStageFormatter.php
docroot/modules/custom/bos_content/modules/node_buildinghousing/src/BuildingHousingUtils.php
Project Type - Custom field in node_buildinghousing.module
Contact information - Custom field in node_buildinghousing.module
Email sign-up - Custom field in node_buildinghousing.module
Feedback form - settings
The following Drupal Entities are created to warehouse/cache data which originates in Salesforce.
The following nodes have records which are populated (add, update and delete) by mappings between Salesforce and Drupal which are run each cron cycle.
The following taxonomies have been created and their list items are maintained manually by Drupal developers. Taxonomy items can be added and deleted as needed, but usually will need adjustments to code to work as required.
As at March 2023
Entity | Name | Description |
---|---|---|
node | bh_project | The primary content type for a Building Housing Property. Contains metadata about the Project and links to updates, attachments, parcels etc. |
node | bh_update | Contains information about updates to a project. This includes certain status changes, attached documents, links to community meeting records and comments from CoB Project Managers to insert into the timeline. |
node | bh_meeting | Contains information about Community Meetings held by CoB with residents regarding Building Housing Properties. |
node | bh_parcel | The official parcel number and top-level info - with GIS coordinates for the parcel. |
node | bh_parcel_project_assoc | Possibly deprecated? I believe the parcel # is now saved in the field_bh_parcel_id field of bh_project. |
node | bh_contact | Deprecated (no data) |
node | bh_account | Deprecated (no data) |
taxonomy | bh_project_stage | The overall project stage (usually when project status = active). Linked directly from |
taxonomy | bh_project_status | The status of the Project. Linked directly from |
taxonomy | bh_funding_stage | The funding stage for the Project. Linked directly from |
taxonomy | bh_project_type | The broad project type for the Project. Linked directly from |
taxonomy | bh_project_update_type | Update type for a |
taxonomy | bh_property_type | The property type. Linked directly from |
taxonomy | bh_public_stage | The Project stage as used in the timeline. Linked directly from |
taxonomy | bh_neighborhood | |
taxonomy | bh_record_type | |
taxonomy | bh_disposition_type | List of disposition types - for use in map. |
view | building_housing | |
view | bh_maps | |
view | building_housing_updates | |
page | buildinghousing/[propertyname] | This is the landing page for information about a property. This is a customized page for the node |
A generic form which is attached to email addresses found on boston.gov, and handles sending emails to those addresses.
A contact us form template is maintained (within script tags) in bos_theme/templates/snippets/contactFormTemplate.html.twig and is included on every boston.gov (Drupal) page.
The patterns library contact form JavaScript function start() (in scripts/components/contact.js) is executed when a boston.gov (Drupal) page loads.
The start() function scans the completed page looking for email addresses anywhere in the HTML being served. Essentially, it:
- replaces the default mailto directive for each email address with a click event listener which will trigger the handleEmailClick() function, and
- attaches a click event listener to the form's submit button and calls the handleFormSubmit() function when the user clicks the submit button on the contact form.
TODO: The handleEmailClick() function is attached once, when the page has finished loading. It should be extended to also run when AJAX events return data to the page (since email addresses could be served by XHR/AJAX as well as traditional document events).
When an email address is clicked on the page, handleEmailClick():
- copies the template form from the script tags,
- inserts the correct email recipient into a hidden field,
- inserts all this onto the page and displays the contact us form, and
- in the background, makes an AJAX request to /rest/email_token/create, generating and saving a unique "session" token in the form.
When the submit button is clicked, handleFormSubmit() validates the form and then submits it, along with an authorization token, to /rest/email_session/contactform.
The PostmarkAPI.php in the bos_email module provides a "guaranteed delivery" type service. It tries to send the email via the Postmark API and, if it fails for some reason, queues the email for later delivery in a Drupal queue. Drupal will retry the email until it is able to send it to the Postmark service.
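A minimal sketch of this send-or-queue pattern is below. The Postmark client object and its send() method are placeholders, not the actual bos_email code; the queue name email_contactform is the one referenced on this page.

```php
<?php

/**
 * Attempt immediate delivery, otherwise queue the item for retry on cron.
 */
function example_send_contactform_email(array $mail_item, $postmark_client): void {
  try {
    // Attempt immediate delivery via the Postmark API (placeholder client).
    $postmark_client->send($mail_item);
  }
  catch (\Exception $e) {
    // On failure, push the item onto the queue; a queue worker retries
    // delivery on later cron runs.
    \Drupal::queue('email_contactform')->createItem($mail_item);
    \Drupal::logger('bos_email')->warning('Email queued for retry: @message', [
      '@message' => $e->getMessage(),
    ]);
  }
}
```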
Email tracking with Drupal has been discontinued with the use of the PostMark service. All logging can be obtained from the PostMark UI/Console.
Emails which fail to send can be viewed in the email_contactform queue.
This requirement could be obsolete, a holdover from earlier versions of the form. Consider removing this "feature" and reverting to having the sender be the email address provided by the constituent. That way the CoB employee/recipient can simply reply to the email.
Emails are sent from an email address that is generated for each email sent. The format of the email address is:
{random_string}@contactform.boston.gov
We don't send the email from the original sender's email address as that could be a vector for an email spoofing attack.
The reply_to header of the email is set to the constituent's email first and the unique address of the contact form second. When someone replies to the email, the to address is set to the constituent's email first and the unique address second. This delivers two copies of the email: one goes directly to the constituent, and one to the contact form API. We log the response time using the copy that gets sent to the contact form API. Once the reply email is delivered to the constituent, further replies will be direct between the constituent and the city.
The contact us email which is sent from Postmark to the CoB recipient is a plain text email.
The PostmarkAPI is capable of generating HTML emails; a nicer experience for CoB staff would be to receive an HTML email.
As well as the tokens etc., we could consider introducing an IP lock for, say, 60 seconds after a contact form submission is made (the timer should be managed on the server side, inside the endpoint). It should not affect genuine users, but it would minimize the impact (and success) of flood attacks and relay exploits on the endpoint. If there were a rapid second submission from the same IP address, we would flash a warning back to the user to try again after 60 seconds.
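A minimal sketch of the proposed per-IP lock using Drupal core's flood service; the event name and limits below are hypothetical, since this enhancement is not yet implemented.

```php
<?php

/**
 * Returns TRUE if this IP is allowed to submit, FALSE if it should wait.
 */
function example_contactform_flood_check(): bool {
  $flood = \Drupal::flood();
  $ip = \Drupal::request()->getClientIp();

  // Allow 1 submission per 60 seconds per IP address (hypothetical limits).
  if (!$flood->isAllowed('bos_email.contactform', 1, 60, $ip)) {
    // Caller should flash "try again after 60 seconds" back to the user.
    return FALSE;
  }
  $flood->register('bos_email.contactform', 60, $ip);
  return TRUE;
}
```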
At the moment, there is no confirmation of a successful submission other than the on-screen notification. We have the submitters' email address, so we could send an email confirming the submission (see next enhancement).
There is little in the way of validation of the email the submitter provides as a contact for responses.
The JavaScript validation process checks that a "visually" valid email pattern is entered (i.e. it is an email-pattern string) but does not validate that the email address exists.
If we send email confirmations, then we could use that process to determine whether the email address is active and does not have temporary errors (mailbox full etc.). Since we are using AJAX, it would be simple to send the confirmation email, wait a period of time (say 10 seconds) and then query Postmark to see if the email was delivered. If it was, return success; if not, flag it to the submitter on the form. This would not be 100% effective because some errors take time to be reported, and we cannot wait too long during validation. The other two approaches above should still be implemented to try to filter out malicious actions, and to detect innocent errors before consuming email server resources.
This app uses the bos_email
provided services as .
@see
(needs updating)
In 2022, this ticket () added a second email address box to try to prevent typos in the email address. While it helped a lot, it is not 100% effective and some emails from innocent errors and malicious actors (e.g. spammers) still slip through.
2023 provided some modules which can check for DNS validity, disposable emails and also email blacklists. This would further help reduce invalid sender email addresses being provided.
The largest on-page component for a Building Housing Project record in Drupal is its timeline. The timeline visually displays a Project's past and projected events and information in a time-ordered list.
The main code is contained here:
docroot/modules/custom/bos_content/modules/node_buildinghousing/src/Plugin/Field/FieldFormatter/EntityReferenceTaxonomyTermBSPublicStageFormatter.php
Various templates are defined to create each timeline article, one template per timeline type.
Timeline items are gathered from various places in the system.
- Text posts are items in the field_bh_text_updates field of the bh_update entity. Tiles are inserted using the message's create date for ordering. If the message is updated in SF, it continues to use the original created date. If the message is deleted in SF it should be deleted from the timeline. Posts are sourced from Salesforce Chatter and imported using this synchronization.
- Documents are file objects which are referenced as items in the field_bh_attachment field of the bh_project. Tiles are inserted using the create date of the attachment for ordering. If Drupal detects an update to the attachment in SF, it continues to use the original created date for ordering. Documents are sourced from Salesforce and imported using this synchronization.
- A single RFP tile is created if the field_bh_rfp_issued_date field is not empty. The tile is inserted using the date in the field_bh_rfp_issued_date field. If the date is changed (or deleted) in SF, the RFP tile will be moved or removed from the timeline. The RFP date is sourced from Salesforce as part of this synchronization.
- Meetings are bh_meeting objects which are referenced as items in the field_bh_update_ref field of the bh_update entity. Tiles are inserted using the meeting's start date. If the meeting is updated in SF, the meeting will move accordingly in the timeline. Meetings are sourced from Salesforce and imported using this synchronization.
- stages
Timeline icons are controlled by CSS and are defined in the function getStageIcon() in EntityReferenceTaxonomyTermBSPublicStageFormatter.php.
The various stages are defined as items in the bh_public_stage taxonomy.
A drupal module that displays various pieces of information about the entered address.
The my neighborhood application is a Drupal component that can be added to any page on boston.gov. Currently, it is here: https://www.boston.gov/my-neighborhood
There are two Drupal endpoints associated with this application:
One for receiving the updated records on a nightly basis (updates): https://www.boston.gov/rest/mnl/update
One for receiving the full load of records: https://www.boston.gov/rest/mnl/import
This page has information on the status of Drupal import scripts that run nightly and once a month.
The data used in this application come from a variety of GIS data sources. This spreadsheet lists them all in addition to the workflow that brings each one into Civis.
These datasources are combined with the SAM address dataset in Civis. This workflow is the one that combines all the datasets.
Every night, the my neighborhood workflow runs and sends any records that have been updated or changed to boston.gov. Once a month, on the 1st, the workflow sends the entire load of records to Drupal.
NOTES: There are two cards that show hard-coded data which does not sync with Civis: the mayor's name in the "YOUR MAYOR" card (found in the "mnl_config.js" file) and the "YOUR AT-LARGE CITY COUNCILORS" card (found in the "Representation.js" file).
In addition to the hard-coded items, there is also a data dependency on ReCollect. We built an endpoint in Drupal ("rest/recollect", found in the Drupal module "bos_mnl") which queries the ReCollect API with the user's inputted address and returns the next trash and recycling date for that specific address.
Jonathan Porter (Analytics division of DoIT) is the best point of contact for the Civis portion. Matt McGowan (Digital division of DoIT) is the best point of contact for the Drupal REST information.
Product
Product requirements (historical and current)
Historical work that *potentially* isn't reflected above: https://drive.google.com/drive/u/0/folders/1QVSDn6CgJiMY7dbYEiKKDRRX1cYOEHmu
-- OUT OF DATE Feb 2023 --
Digital team is responsible for showing unofficial election results on Boston.gov.
On election night, an HTML file of partial results is generated during tabulation and copied to the cityofboston.gov’s web server. The contents of this file are inserted into the Boston.gov unofficial election results page via a client-side AJAX request.
Historically, election night traffic for these results has been more than cityofboston.gov was comfortably able to handle. We previously ran a job on Heroku that copied the page from cityofboston.gov and put it on an S3 bucket, and the Boston.gov page referenced that S3 version.
As of January 2019, we’ve modified both the source file and our Incapsula settings so that Incapsula caches all requests for the file for 60s. This let us turn off the Heroku job.
The Google reCAPTCHA v3 returns a score for each request without user friction. The score is based on interactions with your site and enables you to take an appropriate action for your site.
Install Google reCAPTCHA
Make sure to register reCAPTCHA v3 keys (Secret and Site) here.
Follow the installation and configuration instructions here to add reCAPTCHA to your Drupal site: https://www.drupal.org/docs/8/modules/recaptcha-v3/installation-and-configuration
Finally, navigate to your site and set up the form(s) that need reCAPTCHA added.
reCaptcha analytics
Url to view scores: https://www.google.com/recaptcha/admin/site/430165885
Environment variables
All variables are in Acquia: https://cloud.acquia.com/a/develop/applications/
To update environment variables properly, make sure to update the private repo with the current variable name and then add it to Acquia. Only add the API key to Acquia.
Make sure to add "recaptcha_v3.settings" to the gitignore.
Metrolist allows Boston residents to search for affordable housing. The Search and AMI Estimator experiences are a JS WebApp. The rest of the app is built in Drupal (this module), with the underlying data layer provided by Salesforce.
Another feature of this module is allowing property managers to list and update their Metrolist listings without directly contacting DND.
View details of a MetroList Development listing
Submit a new Development listing to DND (Sales Force)
Provide data about Developments via JSON API to the Search and AMI Calculator JS WebApp
AMI data tables
Auto-create and update Developments from Sales Force on CRON (5 minutes)
Drupal
Sales Force
Google Maps API
Postmark
bos_metrolist (docroot/modules/custom/bos_content/modules/bos_metrolist)
Views
Webforms
Geolocation
Salesforce
MetroList Development (/admin/structure/types/manage/metrolist_development)
MetroList Unit (/admin/structure/types/manage/metrolist_unit)
Availability Status
City
Due at signing
Features (MetroList)
Income Eligibility AMI Threshold
MetroList: User Guide Type
Neighborhoods (MetroList)
Occupancy Type
Region
Rent Type
Unit Type
Utilities Included
This feature shows details about a development and its aggregated units. Any new data is pulled from Sales Force on every CRON run. The display for the development page is mainly controlled by the Available Units View that is included via the Development display.
Entry point: /metrolist/search (Use the MetroList Search JS WebApp to find a Development)
URL pattern: /metrolist/search/housing/{Development Name}
Custom CSS: docroot/modules/custom/bos_content/modules/bos_metrolist/css/views_ml_block_avaliable_units.css
Views Template: docroot/modules/custom/bos_content/modules/bos_metrolist/templates/views-view-metrolist-drawers.html.twig
Views Style: docroot/modules/custom/bos_content/modules/bos_metrolist/src/Plugin/views/style/MetroListDrawersStyle.php
Helper Functions (Pre-process and alters): docroot/modules/custom/bos_content/modules/bos_metrolist/bos_metrolist.module
Location Information
View - Available Units - Header
Back to Search
Map - settings
Development Name
Address
Area
Unit Type
Share Listing
Unit Information
View - Available Units - Fields
The Unit entities are aggregated based on type, price, AMI, bedrooms and bathrooms and then related to the proper Development.
Application and lottery information
View - Available Units - Footer
Application type and link
Agent Contact Information
Listing Type Information
Affordable Housing Lean More CTAs
A paragraph from the MetroList landing page that is included as a paragraph library item.
Feedback Form
A webform used to collect user feedback and send an email to DND contacts
This feature allows project managers throughout the state to submit new developments to DND for review and eventual listing on MetroList. The user starts by navigating to the MetroList listing form landing page. On this page they enter their email in the MetroList listing request form. When that webform is submitted, a new MetroList Listing form draft is created and an email is sent to the user with a token link (48 hour TTL). The user then clicks the private link and is taken to the Listing form. The listing form then communicates with Sales Force on each step to either pre-fill information about the user and their properties or create new objects on Sales Force using the custom webform submit handler. Once the form is submitted, the new objects are created and emails are sent to the user and DND.
Webforms:
MetroList Listing Request Form
MetroList Listing Form
Webform Handler: docroot/modules/custom/bos_content/modules/bos_metrolist/src/Plugin/WebformHandler/CreateMetroListingWebformHandler.php
Helper Functions (Pre-process and alters): docroot/modules/custom/bos_content/modules/bos_metrolist/bos_metrolist.module
Sales Force Connector (Uses Sales Force module): docroot/modules/custom/bos_content/modules/bos_metrolist/src/MetroListSalesForceConnection.php
This feature uses Views REST Export to create a simple JSON API for the MetroList JS WebApp. The application is able to make a request with a number of parameters and receive back MetroList development information and AMI information.
API Paths:
/metrolist/api/v1/developments?_format=json
/metrolist/api/v1/ami/hud/base?_format=json
Views:
/admin/structure/views/view/metrolist/edit/rest_export_nested_1
/admin/structure/views/view/metrolist_ami/edit/ml_ami_hud_base
Views Style:
docroot/modules/custom/bos_content/modules/bos_metrolist/src/Plugin/views/style/MetroListSerializer.php
Helper Functions (Pre-process and alters): docroot/modules/custom/bos_content/modules/bos_metrolist/bos_metrolist.module
This feature uses a View to create a couple of data tables that are then used on the income-restricted housing guide. It uses the same base data as the AMI API, from the Income Eligibility AMI Threshold taxonomy.
Entry point: /income-restricted-housing-guide
Views:
/admin/structure/views/view/metrolist_ami/edit/ami_table_hud
/admin/structure/views/view/metrolist_ami/edit/ami_table_bpda
Views Field/Validator:
docroot/modules/custom/bos_content/modules/bos_metrolist/src/Plugin/views/field/AMIThresholdTerms.php
docroot/modules/custom/bos_content/modules/bos_metrolist/src/Plugin/views/argument_validator/TermAMIThreshold.php
Views Template: docroot/modules/custom/bos_content/modules/bos_metrolist/templates/views-view-table--metrolist-ami.html.twig
Helper Functions (Pre-process and alters): docroot/modules/custom/bos_content/modules/bos_metrolist/bos_metrolist.module
This feature allows Drupal entities to sync back and forth with Sales Force Objects via the Drupal Sales Force module. It is primarily used by DND to use the data and access that is already on DND's Sales Force server to automatically sync with the Boston.gov Drupal site. This is controlled by field mapping configurations in the Drupal Sales Force module. Currently, all syncing is scheduled to happen on Drupal CRON run, every 5 minutes, with only updated objects.
Sales Force Mappings:
MetroList - Development (/admin/structure/salesforce/mappings/manage/metrolist_development/fields)
metrolist_development --> Development__c
MetroList - Unit (/admin/structure/salesforce/mappings/manage/metrolist_unit/fields)
metrolist_unit --> Development_Unit__c
Sales Force Settings:
MetroList - Development (/admin/structure/salesforce/mappings/manage/metrolist_development)
MetroList - Unit (/admin/structure/salesforce/mappings/manage/metrolist_unit)
Sales Force Mapping Field: docroot/modules/custom/bos_content/modules/bos_metrolist/src/Plugin/SalesforceMappingField/RelatedTermStrings.php
Troubleshoot Sales Force connection issues
If Drupal and Sales Force are not connecting or syncing please check the Authorization from Drupal to Sales Force (/admin/config/salesforce/authorize/list). You may need to Re-auth or even make a new connection if you need to connect to a lower development or testing environment on Sales Force. If you need access to an instance contact DND's Sales Force developer/administrator.
If a single item is not syncing, or if you need info about the Drupal to Sales Force connection, you can view the list on this admin page. If you edit the instance, you then have the option to force pull or push the Drupal entity with the Sales Force Object. If there is an issue you should see an error message in the response. You can also find other useful info like timestamps and record ids.
This has been updated for Drupal 10, as at October 2023.
IDP: Identity Provider - The IDP is Ping Federate which is managed by the City's IAM team. The actual user accounts are stored in Microsoft Active Directory and other City managed domain services which are connected to Ping. The IDP's function is to provide the user "authentication" checking user credentials and reporting security roles - reducing or removing the SP 's responsibility for verifying access and maintaining the security element of user accounts.
SP: Service Provider - The SP can be thought of as a series of authentication functions within an application. In most implementations, the IDP and SP communicate using the SAML protocol. On successful authentication with the IDP, the SP receives a standard and authoritative message (usually via a SAML formatted cookie) detailing the user's name, email and (ideally but optionally) the user's security roles/groups.
In the case of boston.gov the SP is Drupal, or more specifically the SamlAuth module in Drupal.
The SamlAuth module (config-UI here) manages the communication with Ping (IDP) and exchanges SAML messages during a standard SAML process.
The IDP is used only to authenticate the user and SamlAuth receives a SAML message back from Ping (IDP) containing the username (cn), firstname, lastname and email of the authenticated user. If Authentication at Ping fails, the user cannot login to Drupal.
SamlAuth maps user accounts from Ping with user accounts in Drupal, where the cn
from Ping equals the username
in Drupal. If no matching account exists, SamlAuth automatically creates an account in Drupal, and assigns the content_author
role.
Security Groups in Ping are not reported to Drupal/SamlAuth, so no attempt is made to synchronize roles between Ping and Drupal. Even though it is not done at this time, this can be simply implemented in the future. Therefore, changes to user groups in Ping and Drupal are independent and never synchronized.
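If role synchronization were implemented in the future, a minimal sketch using only core user APIs might look like the following. The group names, role names, and the mapping itself are hypothetical and are not part of the current SamlAuth configuration.

```php
<?php

use Drupal\user\Entity\User;

/**
 * Assign Drupal roles based on group names reported in a SAML response.
 *
 * Sketch only: $groups and the group-to-role map are hypothetical.
 */
function example_sync_roles_from_saml(User $account, array $groups): void {
  $map = [
    'AD-Content-Editors' => 'content_author',
    'AD-Site-Admins' => 'administrator',
  ];
  foreach ($map as $group => $role) {
    if (in_array($group, $groups, TRUE)) {
      $account->addRole($role);
    }
  }
  $account->save();
}
```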
Drupal requires an email address to create an account. The email address does not need to be on the boston.gov domain, and it does not need to be unique. Ping (actually the IAM authoritative record systems behind Ping) cannot guarantee to provide an email address, so SamlAuth has been modified to create a "fake" email address using the pattern username@boston.gov
when there is no email in the SAML response from Ping.
SamlAuth requires an x509 security certificate (issued from or by the IDP (Ping)) in order to verify authenticity of the SAML response, and to decrypt the SAML message contents. This x509 certificate is created and distributed by IAM and is stored in environment variables and accessed by SamlAuth via the Key module.
In CoB, we use two IDPs - one for prod and one for all non-prod environments.
The prod IDP is accessed via sso.boston.gov.
The non-prod IDP is the IAM test provider at sso-test.boston.gov.
Boston.gov users typically use Access Boston to login to boston.gov (Drupal).
The tile on Access Boston Production launches content.boston.gov (prod)
The tile on Access Boston Test launches d8-stg.boston.gov (stage).
To use SSO to login to any other Acquia environment, you cannot use Access Boston and need to use the saml/login link. For example:
Boston.gov dev => https://d8-dev.boston.gov/saml/login
Boston.gov UAT => https://d8-uat.boston.gov/saml/login
Boston.gov local => https://boston.lndo.site/saml/login
From time to time the IDP certificate will expire and need to be re-issued by the Identity and Access Management team.
You should request the IDP Metadata XML from IAM. This info will come as a single file per environment, with the extension .xml. See the next box to work out which part of the metadata is the certificate.
For Acquia Environments:
Login to the Acquia Cloud Console
Navigate to the environment you wish to update, and select the variables section.
Update the environment variable with the new certificate (cut and paste it over the existing entry) and save.
Note: The change will be immediate on the environment.
For your Local Dev Environment:
Your local dev environment uses the IDP Metadata from the IAM Test environment.
Login to your local Drupal
On this page, edit the key you wish to update
On the key page, in the certificate box, paste the new x509 certificate
Save the page.
Tip
You can check the expiry of the IDP certificate by clicking the info button on the key entry console.
Metrolist allows Boston residents to search for affordable housing. The Search and AMI Estimator experiences are built in React (this repository). The rest of the app is built in Drupal, with the underlying data layer provided by Salesforce. The core UX is composed of the following:
- Homepage: Links to Search, AMI Estimator, and introductory information. Route: /metrolist/. Controlled by: Drupal.
- Search: Lists housing opportunities in a paginated fashion and allows the user to filter according to various criteria. Route: /metrolist/search. Controlled by: React. APIs in use: Developments API.
- AMI Estimator: Takes the user's household income and household size, and calculates a recommendation for which housing opportunities to look at. URL: /metrolist/ami-estimator/. Controlled by: React. APIs in use: AMI API. Sub-routes:
  - /metrolist/ami-estimator/household-income
  - /metrolist/ami-estimator/disclosure
  - /metrolist/ami-estimator/result
- Property Pages: Route: /metrolist/search/housing/[property]?[parameters]. Controlled by: Drupal.
- Developments API: Lists housing opportunities as a JSON object. URL: /metrolist/api/v1/developments?_format=json
- AMI API: Lists income qualification brackets as a JSON object, taken from HUD (Department of Housing and Urban Development) data. URL: /metrolist/api/v1/ami/hud/base?_format=json
Prerequisites:
Node.js
Yarn or NPM (These docs use yarn
but it can be substituted for npm
if you prefer.)
Git
Read/write access to CityOfBoston
GitHub
⚠️ Warning: These docs were written for a standalone installation of the Metrolist React codebase, which outputs JavaScript files that can be committed to the Drupal monorepo separately. However, the React codebase has since been subsumed into the monorepo, rendering certain build instructions herein out-of-date. Please refer to the Boston.gov documentation for further instruction.
yarn start
runs:
ipconfig getifaddr en6
(or ipconfig getifaddr en0
if en6
isn’t found), which determines which LAN IP to bind to. This allows testing on mobile devices connected to the same network.
webpack-dev-server
. This compiles the ES6+ JavaScript and starts an HTTP server on port 8080 at the address found in the previous step.
Note: The ipconfig
command has only been tested on a Mac, and it also may not work if your connection isn’t located at en6
or en0
.
This runs webpack-dev-server
without launching a new browser window automatically.
There are Node.js scripts available under _scripts/
to aid development efforts.
Located at _scripts/component.js
, this facilitates CRUD-style operations on components.
This copies everything under _templates/components/Component
to src/components/Widget
and does a case-sensitive find-and-replace on the term “component”, replacing it with your new component’s name. For instance, this index.js
template:
…becomes this:
Subcomponents can also be added. These are useful if you want to encapsulate some functionality inside of a larger component, but this smaller component isn’t useful elsewhere in the app.
This creates the directory src/components/Widget/_WidgetGadget
containing this index.js
:
As you can see, the hierarchical relationship between Widget and Gadget is reflected in the naming. The React display name is WidgetGadget
, and the CSS class name uses a BEM element gadget
belonging to the widget
block, i.e. widget__gadget
.
This renames the directory and does a find-and-replace on its contents.
⚠️ Known issue: The component renaming algorithm does not fully find/replace on subcomponents.
Due to compatibility issues with Google Translate, the AMI API is not fetched live from the AMI Estimator. Instead, it is fetched at compile time using this script, which caches it as a local JSON file at src/components/AmiEstimator/ami-definitions.json
.
The domain from which this data is fetched can be specified with the following environment IDs:
www
or prod
→ https://www.boston.gov
Acquia environment
dev2
→ https://d8-dev2.boston.gov
etc.
The default value is ci
, as that should have the most recent data set in most cases.
Sets the version number for Metrolist in Drupal’s libraries.yml
file and this project’s package.json
file.
Prefer readability for other developers over less typing for yourself.
HTML/CSS:
JavaScript:
Consistent and readable JavaScript formatting is enforced by eslint-config-hughx
+ an ESLint auto-formatter of your choice, such as ESLint for VS Code.
Use Functional Programming principles as often as possible to aid maintainability and predictability. The basic idea is for every function to produce the same output for a given set of inputs regardless of when/where/how often they are called. This means a preference for functions taking their values from explicit parameters as opposed to reading variables from the surrounding scope. Additionally, a function should not produce side-effects by e.g. changing the value of a variable in the surrounding scope.
metrolist/
__mocks__/
: Mocked functions for unit/integration tests.
_scripts/
: CLI tools
_templates/
: Stubbed files for project scaffolding. Used by CLI tools.
coverage/
: Code coverage report. Auto-generated. (.gitignore
’d)
dist/
: Build output. Auto-generated. (.gitignore
’d)
public/
: Static files such as images, favicon, etc. These files are not used by Drupal, which uses its own templating; they are only used in development. Thus, images have to be copied to the appropriate directory prior to deployment.
src/
: React source.
components/
: React components.
globals/
: SASS variables, mixins, etc. which are used cross-component.
util/
: Utility functions.
index.js
: React entrypoint.
index.scss
: App-wide styles. (Use sparingly; prefer component-scoped.)
serviceWorker.js
: Service Worker code from Create React App; not currently used.
setupTests.js
: Jest configuration.
_redirects
: Netlify redirects.
.env
, .env.development
, .env.production
: Dotenv configuration (environment variables).
.eslintrc.js
: ESLint configuration.
.travis.yml
: Travis CI configuration.
babel.config.js
: Babel configuration.
DEVNOTES.md
: Notes taken during development.
package.json
: Project metadata/NPM dependencies.
postcss.config.js
: PostCSS configuration. Used to postprocess CSS output.
README.md
: Project documentation (this file).
webpack.config.js
, webpack.production.js
, webpack.staging.js
: Webpack configurations for different environments.
yarn.lock
/package-lock.json
: Yarn/NPM dependency lock file.
Every React component consists of the following structure:
Component/
__tests__
: Integration tests (optional)
Component.scss
: SASS styling
Component.test.js
: Unit test
index.js
: React component
methods.js
: Any methods that don’t need to go in the render function, for tidiness. (optional)
All classes are namespaced with ml- for Metrolist to avoid collisions with the main Boston.gov site and/or third-party libraries.
Vanilla BEM (Block-Element-Modifier):
Blocks: Lowercase name (block
)
Elements: two underscores appended to block (block__element
)
Modifiers: two dashes appended to block or element (block--modifier
, block__element--modifier
).
When writing modifiers, ensure the base class is also present; modifiers should not mean anything on their own. This also gives modifiers higher specificity than regular classes, which helps ensure that they actually get applied.
An exception to this would be for mixin classes that are intended to be used broadly. For example, responsive utilities to show/hide elements at different breakpoints:
Don’t reflect the expected DOM structure in class names, as this expectation is likely to break as projects evolve. Only indicate which block owns the element. This allows components to be transposable and avoids extremely long class names.
Avoid parent selectors when constructing BEM classes. This allows the full selector to be searchable in IDEs. (Though there is a VS Code extension, CSS Navigation, that solves this problem, we can’t assume everyone will have it or VS Code installed.)
Always include parentheses when calling mixins, even if they have no arguments.
Don’t declare margins directly on components, only in wrappers.
Rucksack is installed to enable the same CSS helper functions that are used on Patterns, such as font-size: responsive 16px 24px
.
Currently this is used for previewing on Netlify, to get a live URL up without going through the lengthy Travis and Acquia build process.
This first runs a production Webpack build (referencing webpack.config.js
), then copies the result of that build to ../boston.gov-d8/docroot/modules/custom/bos_components/modules/bos_web_app/apps/metrolist/
, replacing whatever was there beforehand. This requires you to have the boston.gov-d8
repo checked out and up-to-date one directory up from the project root.
To make asset URLs work both locally and on Drupal, all references to /images/
get find-and-replaced to https://assets.boston.gov/icons/metrolist/
when building for production. Note that this requires assets to be uploaded to assets.boston.gov
first, by someone with appropriate access. If you want to look at a production build without uploading to assets.boston.gov
first, you can run a staging build instead.
This is identical to the production build, except Webpack replaces references to /images/
with /modules/custom/bos_components/modules/bos_web_app/apps/metrolist/images/
. This is where images normally wind up when running yarn copy:drupal
.
Aliases exist to avoid long pathnames, e.g. import '@components/Foo'
instead of import '../../../components/Foo'
. Any time an alias is added or removed, three configuration files have to be updated: webpack.config.js
for compilation, jest.config.js
for testing, and .eslintrc.js
for linting. Each one has a slightly different syntax but they all boil down to JSON key-value pairs of the form [alias] → [full path]. Here are the same aliases defined across all three configs:
webpack.config.js
:
jest.config.js
:
.eslintrc.js
:
All mailto:
links require the class hide-form
to be set, otherwise they will trigger the generic feedback form.
We’re using Jest + React Testing Library to ensure that future development doesn’t break existing functionality.
Every component should have its own unit test in the same directory. This is enforced by the Component test stub (_templates/components/Component/Component.test.js
), which contains the following:
So when running yarn component add
, you automatically generate a test that fails by default. You have to manually uncomment the call to render
(and ideally write more specific tests) in order to pass. This is designed to be annoying so it isn’t neglected.
When testing interactions between two or more components, or for utility functions (src/util
), put tests in a nested __tests__
directory.
One example of this is the Search
component, which contains a separate test file for every FiltersPanel
+ ResultsPanel
interaction:
You have to run a browser without CORS restrictions enabled. For Chrome on macOS, you can add this to your ~/.bash_profile
, ~/.zshrc
, or equivalent for convenience:
This will prevent you from running your normal Chrome profile. To run both simultaneously, install an alternate Chrome such as Canary or Chromium. For Canary you would use this command instead:
Then in a terminal, just type chrome-insecure
and you will get a separate window with no security and no user profile attached. Sometimes Google changes the necessary commands to disable security, so check around online if this command doesn’t work for you. Unfortunately no extensions will be installed for this profile, and if you install any they will only exist for that session since your data directory is under /tmp/
.
We’re using React Router for routing, which provides a Link
component to use in place of a
. Link
uses history.pushState
under the hood, but this will fail inside the Google Translate iframe due to cross-domain security features in the browser. (For an in-depth technical explanation of why this happens, see DEVNOTES). So in order to make app navigation work again, we have to hack around the issue like so:
Change base.href
to the Google Translate iframe domain,
Perform the navigation,
Change base.href
back to boston.gov immediately afterward to make sure normal links and assets don’t break.
To do this automatically, there is a custom Metrolist Link
which wraps the React Router Link
and attaches a click handler with the workaround logic. So, anytime you want to use React Router’s Link
, you need to import and use @components/Link
instead. This is the technique used by the Search component to link to the different pages of results.
If instead you want to use React Router’s history.push
(or the browser-native history.pushState
) manually, you can import these helper functions individually:
This is the technique used by the AMI Estimator component to navigate between the different steps in the form.
Boston.gov integrates with many services across the City. Here is a run down of the methods for integrating with the website.
Some things to keep in mind when building against an API:
Try to use existing modules when possible. We currently utilize a Salesforce module to bring data in for MetroList.
Use migrations if possible to store data locally when the data doesn’t change frequently and isn't likely to grow exponentially. This allows us to use Views within the sites.
Boston.gov provides some APIs of its data. Currently, the following APIs are available:
The next five upcoming events can be retrieved in JSON format from /api/v1/upcoming-events.
All upcoming public notices can be retrieved from /api/v1/public-notices. This is rendered in JSON format.
Blank layouts that can be used by external applications exist at /api/v1/layouts. Currently, only /api/v1/layouts/search is available, but more will be added.
The blank layouts should be used by external applications to provide wrapper HTML. This will also provide the necessary updates to navigation as things change at Boston.gov.
In the search example, we include the necessary <%= yield %>
tag that is used by our Rails based search application.
API Feed for Status Items from boston.gov.
GET
/api/v1/status_items
Status Items are information cards which are maintained by the website Content Editors to advise constituents on the status of various (usually physical) city services.
Conceptually, a status item relates to a city service (parking, street sweeping, city office opening hours etc) and contains a set of messages that can be displayed at different times.
The only place we presently display status items is on the homepage.
The API is consumed by the ConnectedBits mobile App, which calls the API periodically to get an update on the status items. The ConnectedBits App checks for added/deleted status items and the updated_at field from the API and alters content on the App as necessary. The response format is customised to meet requirements provided by ConnectedBits in this document.
The main status item entity is a node called status_item
. The status_item node contains:
a title field which is the name of the status item,
a link to an icon,
a field to enable/disable the status item, and
a collection of messages to show for the status item.
Each message in the collection is a message_for_the_day paragraph entity which contains:
the text for the message and
information on when the message will be displayed.
The API is implemented using a view display (view: status_items - display: bos_311_motd_api [Bos311]). The view handles the selection and filtering of the information that will be provided by the API.
There is a small amount of pre-processing code in the node_status_item.module and bos_messages.module to validate and de-duplicate records, to be sure the output matches what is displayed on the site at the time the API is called.
There is a Drupal View Formatter (a style plugin), Bos311Serializer.php, which is used to re-organize the fields once the view has completed processing. This class does not filter or otherwise alter the view output, other than ensuring the JSON result is organized in the format required by the ConnectedBits specification document (above).
Technical and project documentation for charts on boston.gov.
In general, content on boston.gov should be at or below an 8th grade reading level. In an attempt to be in accordance with this and keep our site as easy-to-understand as possible, we push for simple and straight forward charts (e.g. bar charts, line charts, and sometimes pie charts if we have to).
Coordination with both the boston.gov Content Manager (for chart placement) and the Analytics Team (for chart data) will be required to get a chart on boston.gov.
The general overview for getting a chart on boston.gov is:
Get some data
Work with the Analytics Team to get it into a public s3 bucket and set up an automated workflow for it getting updated (if appropriate).
Grab the schema from one of the charts below that looks/is most like the one you want to build
Add your chart to boston.gov through Drupal.
Vega/VegaLite can read in data from a publicly available url. We leverage this functionality so that we can set up separate automation practices for the data charts on boston.gov display. This helps ensure the data on the chart stays up-to-date.
The Analytics Team has the ability to create s3 buckets that can store completely public csv files. Once a file has been loaded to a public bucket, you can supply the "Object URL" to Vega.
Drops the data into Civis' Postgres database
With this workflow, we can ensure that when the Budget Office needs to make a change to a chart on boston.gov, they only have to update the agreed-upon spreadsheet. More detailed documentation on this process. (add link)
To create the charts, we use a JSON schema created and maintained by the authors of Vega/VegaLite. We primarily rely on VegaLite as it is easier to use than Vega. The only time Vega is used is to create a pie chart.
A good place to get started for an overall understanding of the libraries is their docs:
The <cob-chart>
web component supports an additional section of the schema called "boston". This section supports three inputs specific to our use cases:
"chartID" - string, optional: unique id of the chart
"defaultSelection": string, optional: value of a drop down's default selection
Below are examples of how each specific type of chart gets implemented at the City of Boston. Links to example schemas are given with call outs of particularly interesting functionality the chart contains.
The height of the chart defined in the schema will be respected and used when the chart is rendered as a web component on a boston.gov page.
The width of the chart will be overwritten based on the screen size the chart is rendered on if "autosize": "fit"
is in the schema. This helps ensure a chart fits on both large and small screens.
To achieve that, we add "sort"
and define the field, operation, and order of the bars to the axis definition we want sorted.
If you are using a selection, the chart will work if you do not define tooltips, so work with the data owner to ensure field names are appropriate for displaying. An example of this is shown in the "Line Chart with Selection" tab above.
The height and width on pie charts should pretty much always be set to 200px. This ensures that the chart fits on all screen sizes.
If the height and width are set to more than 200px, the chart will resize to best fit the container it is loaded in, but the center of the chart will not change if the screen is resized, so it may get cut off if the user adjusts their screen size.
Data from a URL can be parsed to ensure it is read as the correct data type when Vega reads it. This is a good idea to ensure your chart will perform as expected and any issues with the data input will get turned to null
values instead of the entire column being read as a string.
This can be extremely helpful as Departments may often store and/or conceptualize their data in the "wide" format, while charting libraries and data nerds like data in a "long" format.
Now that our data is formatted correctly, we can use the "Type" column to color our chart so we get three lines showing the trends in the different types of State Aid.
TBD
SamlAuth:
Keys:
Depending on the level of complexity, a restful API can be used to integrate with Boston.gov. Currently, the CityScore module connects to an API to display data on the page.
We use the Vega and VegaLite libraries to create our charts on boston.gov. VegaLite is an easier-to-use version of Vega. Both libraries are built on top of . It does a little more guess work for us so we don't have to be so specific in defining our charts. These libraries let users define/create a chart using a JSON schema.
From a technical standpoint, we wrap this library up as a web component that is stored in , our patterns library. The <cob-chart>
component takes a JSON object as input in a Vega or VegaLite schema as input and creates a chart.
From a functional standpoint, when putting charts on boston.gov, we place them within text components in an effort to bring context to the data and information the chart is displaying. The has good examples of how charts should be used with narrative explanation.
Drop it into the to start editing it to use your data
Check that your chart works/looks okay using the .
For testing purposes, you could upload a csv as a and use that url, but when things move to production, all data should be pushing/pulling from an s3 bucket.
Storing data for charts on s3 means we can set up separate workflows that will automatically update the data on the chart. For example, for the , we set up a workflow in (the Analytics Team's data warehouse and ETL platform) that:
Pulls data from a source that the Budget Office updates
Uploads the tables as csv to s3 (e.g. ), replacing existing files of the same name with new information.
The Vega libraries can do much more than what we've implemented at the City. The charts web component was built and tested for only the following charts:
Each of the above charts can be built with one selection element as well. The selection element can only be built off one field and will always appear as a drop down box.
When developing a chart, it will likely be easiest to start using the . Once the chart has largely been built, the is helpful to make sure the chart will still work when rendered as a web component and for making sure fonts look okay.
"minWidth"- integer, optional: minimum width in pixels of the chart. Due to data constraints, some charts will not be readable on smaller screens. In this case, we can supply a minimum width to the chart and users will be able to horizontal scroll on any screen smaller than the minimum width supplied. .
Drop the JSON from the file linked below in the to see the chart below ():
Drop the json linked below into the to see the chart below ():
If you want the colored sections of bar charts to display in a specified order, you can use a VegaLite calculate transform to create a new field in the data. In the example below, this field is called "barOrder". You can then use that field in the "order" encoding so that the number of Community-Based Organization seats is always on top.
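A sketch of that approach, assuming a "Type" field whose values include "Community-Based Organization" (the other field names are illustrative):

```json
"transform": [
  {
    "calculate": "datum.Type === 'Community-Based Organization' ? 0 : 1",
    "as": "barOrder"
  }
],
"encoding": {
  "x": {"field": "Board", "type": "nominal"},
  "y": {"field": "Seats", "type": "quantitative"},
  "color": {"field": "Type", "type": "nominal"},
  "order": {"field": "barOrder", "type": "quantitative"}
}
```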
You can use the "sort" property of the schema or of an axis to sort by various metrics. For example, we may want to have the longest bars at the top of a bar chart.
Drop the JSON from the file linked below in the to see the chart below ():
Drop the JSON from the file linked below in the to see the chart below:
Due to a bug in the version of VegaLite we're using, defining your own tooltip with a "line" mark will break the chart.
When you are not using a selection, you can get around this by layering a line mark and a point mark on top of each other in the chart and defining the tooltip in the point mark's encoding. Use the line chart schema as an example of how to do this.
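A minimal sketch of the layering workaround (field names are placeholders); note the tooltip lives only on the point mark's encoding:

```json
"layer": [
  {"mark": "line"},
  {
    "mark": {"type": "point", "filled": true},
    "encoding": {
      "tooltip": [
        {"field": "Fiscal Year", "type": "ordinal"},
        {"field": "Amount", "type": "quantitative", "format": "$,.0f"}
      ]
    }
  }
],
"encoding": {
  "x": {"field": "Fiscal Year", "type": "ordinal"},
  "y": {"field": "Amount", "type": "quantitative"}
}
```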
Drop the JSON from the file linked below in the to see the chart below ():
Drop the JSON from the file linked below in the to see the chart below:
The main difference between a grouped bar chart and a regular bar chart is that we add a section to the schema's "encoding" where we define the field to group by and style the header.
The COB web component was not created/tested for every grouping option; we only support grouping by column.
Pie charts are the only chart type the <cob-chart> component supports that uses the full Vega schema (rather than VegaLite).
Drop the JSON from the file linked below in the to see the chart below ():
Drop the JSON from the file linked below in the to see the chart below:
Vega supports both "domain" and "range" inputs to the "scale" encoding. This means you can assign individual values to specific colors so they stay consistent as a user interacts with them. This is particularly helpful on charts with a selection, in which not every selection will have all the values shown.
For example, we use domain and range to ensure the Operating Budget chart on budget.boston.gov always uses the same colors for the seven different types of Operating Budget spending.
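As a sketch only (the categories and hex values below are placeholders, not the actual budget palette), a fixed color assignment looks like:

```json
"color": {
  "field": "Spending Type",
  "type": "nominal",
  "scale": {
    "domain": ["Personnel", "Contracted Services", "Supplies", "Equipment"],
    "range": ["#091f2f", "#288be4", "#fb4d42", "#58585b"]
  }
}
```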
While this is helpful, number fields still need to be formatted without commas in the source data. Even when explicitly parsed as a number, a value that contains commas will not be read correctly. You'll have to work with departments or the Analytics Team to make sure the csv's getting pushed to s3 do not have commas in numeric fields.
Vega and VegaLite support a "fold" transform that allows you to reshape your data from wide to long format.
Vega and VegaLite both support number and date formatting, allowing you to do things like put a dollar sign in front of money values.
When working with large numbers, you can use Vega's calculate transform to divide the numbers on an axis by some amount (e.g. a million) so that displaying them on a y-axis becomes easier. You can then use the original field for displaying the full amount in tooltips.
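For example (field names are illustrative), the axis can be driven off a derived field while the tooltip keeps the original value:

```json
"transform": [
  {"calculate": "datum.Amount / 1000000", "as": "AmountMillions"}
],
"encoding": {
  "x": {"field": "Fiscal Year", "type": "ordinal"},
  "y": {
    "field": "AmountMillions",
    "type": "quantitative",
    "axis": {"title": "Amount ($ millions)"}
  },
  "tooltip": [{"field": "Amount", "type": "quantitative", "format": "$,.0f"}]
}
```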
This has been fixed in a later release of VegaLite, but that release also includes refactoring that broke our charts in a bunch of other ways, so yours truly decided to save fixing this for "later".
- Pie charts should always be sized to 200px.
- Grouped bar charts: only vertically grouped bar charts (grouped by column) are supported.
The <cob-chart> web component is stored in Fleet, our patterns library.
Option | Description |
---|---|
-m, --major | Sets the left version part, e.g. 2.x.x. If omitted, major will be taken from the existing Metrolist version. |
-n, --minor | Sets the middle version part, e.g. x.5.x. If omitted, minor will be a hash of index.bundle.js for cache-busting. |
-p, --patch | Sets the right version part, e.g. x.x.3289. If omitted while minor is set, patch will be a hash of index.bundle.js for cache-busting. If omitted while minor is not set, patch will not be set. |
-f, --force | Allow downgrading of the Metrolist version. |
--help | This screen. |
API Guide / listing of all endpoints provided by boston.gov (Drupal 8).
There is a requirement to provide partially completed, fillable official City of Boston PDF forms to constituents on demand in Q3 of each financial year.
In 2023, for Q3 FY2023, this was migrated into Drupal because the existing PDF solution (a compiled .NET solution using a third-party iText component) was not available.
A solution was needed where a form would be filled out and returned to the user using "tax bill" data from the current assessing database on VSQL01.
The PDF should be available as a "GET" download with a parameterized URL so it can be used as a dynamic link in webpages.
An endpoint was written using PHP in the Drupal framework, utilizing the PDFManager module.
A GET endpoint was designed which has a URL format of:
The return is the correct form for the form-type (a PDF), completed with information from the property defined by the parcel_id. The file is downloaded as application/pdf.
The first link provides a flat file which is not fillable, and the second (v2) is a fillable PDF.
Public Notices are contributed directly into Drupal by Content-Editors. This endpoint provides filtered and sorted lists of Public Notices.
GET
https://www.boston.gov/api/v2/public-notices
This endpoint provides a listing of the next 30 public notices which started in the last 3 hours or will start at some time in the future.
Response Notes:
All response fields are always strings.
Blank (empty) fields are still provided as key:value pairs, with the value being an empty string ("") - the API does not use the keyword "null".
Title, body and field_drawer may contain basic HTML mark-up.
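As a rough sketch only (the exact field names other than title, body and field_drawer are not documented here, so the others are hypothetical), a response item follows this shape:

```json
[
  {
    "title": "Zoning Board of Appeal Hearing",
    "body": "<p>The Zoning Board of Appeal will hold a public hearing.</p>",
    "field_drawer": "",
    "start": "2021-06-17T10:00:00"
  }
]
```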
Chart Type | Vega or VegaLite |
---|---|
Line chart | VegaLite |
Bar chart | VegaLite |
Grouped bar chart | VegaLite |
Pie chart | Vega |
Cityscore is a CoB city-performance metric devised by the Mayor's Office and calculated and managed by the Analytics Team.
Drupal (via https://www.boston.gov) is used to provide a public endpoint, or microservice, which can be used by other departments or external organizations to retrieve current cityscore data for use in their own applications.
POST
https://www.boston.gov/rest/cityscore/load
This secure endpoint is used by analytics to load and update the current cityscore data.
Payload Format.
GET
https://www.boston.gov/rest/cityscore/json
This public endpoint returns the latest cityscore indicator value.
GET
https://www.boston.gov/rest/cityscore/html
This public endpoint returns an HTML string which contains a cityscore metric table using the CoB style.
Every week some time should be dedicated to reviewing the status and general health of Drupal and modules.
It is not necessary to complete all steps each week. In some weeks the module update will consume 100% of the time available, and other weeks there will be no module updates required, or the decision is made not to update any modules this week.
In general, the maintenance process should start with the review of module updates, as this is a cornerstone of any maintenance program and ensures the balance of issues are being addressed on a completely updated site.
There are many ways this task can be effectively completed. This way has the advantage of being responsive and relatively low-burden between cycles.
If the basic parts of this process are not conducted weekly, then to ensure the site is kept up-to-date and is adequately patched, there will be a need for monitoring of Drupal/Acquia channels to determine when various patches are available.
Log in to the production website at https://content.boston.gov and then navigate to the available updates report page at /admin/reports/updates. All available updates which are managed by composer are listed on this page.
Following Drupal Best Practices, the downloading and version control of contributed modules (including Drupal core itself) is managed by composer.
Note: Composer is not fully installed on any Acquia server (and does not need to be). Package/Module admin is done by running composer commands locally in development containers.
Compare the version rules in your local composer.json
with the versions listed on the status reports page in Drupal. If necessary, change rules in composer.json
so that the recommended module version will be downloaded.
A useful reference for writing versioning rules in composer can be found here.
Before you start...
Be sure your git working tree is clean and you have pulled the latest develop
branch from GitHub.
Run drush cim
on the develop branch to be sure you have the latest yml files imported into your local DB
Run drush cex
to be sure that all configurations are exported
(If drush cex exports any configuration changes, then you will need to rebuild your develop environment - or at least sync the local database with the one on Acquia develop - using lando drush sql:sync @bostond8.dev @self).
Create a new branch e.g. maintenance-03-03-2020
Run composer update in your local development container.
Check the output of the composer command, and verify that the modules/packages you expect have been added/removed/updated and that patches have been applied without errors. If necessary, fix the errors by updating the composer.json file and re-running lando composer update.
Tip: You can see what changes will actually be made when updating by first running lando composer update --dry-run and comparing the output with the status reports page to see that the required modules will be added, removed and updated.
(Note: you can expect composer to update a considerable number of modules which are not listed on the status report, because composer also checks dependencies and Symfony components that Drupal does not.)
You may see patches for a module fail after the version is updated.
This will either be because the patch is no longer needed (you can probably work this out by checking the module's release notes), or because the file it targets has been altered and the patch cannot find the anchors it needs to apply.
In the latter case, you may need to modify the patch or check the issue related to the patch and see if there is an updated patch you can apply.
If you see this message:
Writing lock file
Generating autoload files
In CopyRequest.php line 91:
fopen(/app/docroot/sites/default/default.services.yml): failed to open stream: Permission denied
Then you need to change the permissions on your local docroot/sites/default folder (i.e. on your host, not in the docker container) to 777, and then rerun the composer command.
After the update is completed, update the Drupal caches (so that the Drupal registries are updated).
Then, see if the updated modules need to apply any updates to the database. This step is done to ensure that the updates will apply properly on the Acquia servers when the deploy scripts run this command.
[Optional] We can export the configuration. It is unlikely that there will be changes to configuration, but it is theoretically possible, so to be safe this is a recommended step. Take care with this and ensure that any configuration changes are changes you expect. You should expect configuration updates only to .yml files from contributed modules updated by composer during this maintenance cycle.
Finally, commit into the main public d8 repository:
- the composer.lock and (if changed) the composer.json files - it is very important to include both (see the "final note" box-out below), and
- any config files generated from drush cex - these will always be in the /config/default folder. Again, take care to check these over; changes to settings files (in particular) could affect production settings.
COMPOSER Command Overview:
composer update: [notes] reads the composer.json
file, works out the package versions which meet all rules (and recursive dependencies), compares versions with the existing packages in the local environment and downloads the packages and dependencies which need updating.
Finally, it updates composer.lock
with the exact versions of each package that are currently installed.
composer install: [notes] reads the composer.lock
file, compares to existing packages and downloads those packages and dependencies which need updating.
If there is no composer.lock
file found then composer will read the composer.json
file and essentially run the composer update
process.
SUMMARY: Recommended overall composer strategy:
- Manually maintain the composer.json
file locally, and use composer update
in the local container to update the composer.lock
file, then
- Commit and merge both the composer.lock
and composer.json
files into the main repository, then
- Deploy scripts will execute composer install
(on Travis) during the site build process so the exact same versions of all packages are deployed on all Acquia servers.
Side Note:
- The local Lando build script scripts/local/lando-build-drupal.sh
which is executed by lando start
, lando restart
, lando build
and lando rebuild
uses composer install
. This is to ensure that the package versions installed locally are the same as the currently loaded package versions on the dev
Acquia server.
- The Travis script scripts/deploy/travis_build.sh
(which builds the full Drupal file system that is deployed to Acquia) also uses composer install.
This is also to ensure that the package versions loaded into the build/deploy artifact are the exact same package versions as in the local development container environment.
Final note (!):
Therefore, the only way packages will be updated on the Acquia servers (ci/uat/dev/stage or prod) is if a developer runs composer update
on their local container, and then merges the resultant composer.lock
file to the develop
branch of the main repo.
- If the composer.lock
file is not merged, then no update will occur.
- It is not necessary to merge changes to the composer.json
to complete an update. However, the composer.json
file should still be committed because, if it is not, it will be out of sync with the current lock file and any changes to version rules will not be available to other developers.
After Step 1 has been completed and all module updates have been applied through to production, the status report can be checked. Pay attention to the Errors and Warnings sections of this report.
Configuration changes may need to be made to clear warnings and/or errors.
Don't make changes in the UI of an Acquia hosted site - these changes should be made to a development version in a local container, the configs exported and merged into the main repository and then deployed in the normal fashion.
Acquia provides some useful diagnostic information in the Cloud UI Console. Much of the information in the Insight reports is taken from the Drupal processes behind steps 1-2 above, so checking those areas should be done first.
Login to the Acquia Cloud Console
Navigate to the Drupal 8 Production environment.
Click on the "Insight" menu item to view a list of config setting mods recommended by Acquia. Note: Acquia Insight provides fairly generic best-practice type comments and it may not be appropriate to adopt/change all of the settings recommended - do some research first, and use common sense.
Click on Stack Metrics to view a series of graphs related to resource consumption and basic network performance. Take a look over these, paying particular attention to disk space and memory utilization graphs.
The City of Boston uses the external siteimprove.com service to monitor the website for slow pages, missing links, etc. Content editors generally monitor this site for broken links, but it also contains a lot of information on the delivery and performance of boston.gov.
Log in to http://www.mysiteimprove.com and check the console for any significant/new issues highlighted.
These subscribe/unsubscribe services are contained in the module bos_email
.
There are REST endpoints provided at:
/rest/email_newsletter
/subscriptions (legacy)
Each time a deploy is performed in production, a backup is taken and saved on the Acquia environment so we can roll back if needed. Periodically we need to purge and archive these backups to conserve space on Acquia.
Follow these steps to purge older backups which are not needed.
SSH onto the Acquia Prod environment
Go to the backups folder.
List the backups.
We should keep all on-demand backups for 3 months; after that we only need to retain the first backup in any month. Delete backups which are older than 3 months and which are not the first backup in a month. For example, in June 2022 you could use the command below to clean up backups from January 2022 - when prompted, enter "n" for the first backup in January and "y" for all subsequent backups.
Each time a deploy is performed in production, a backup is taken and saved on the Acquia environment so we can roll back if needed. Periodically we need to purge and archive these backups to conserve space on Acquia.
Ensure the Quarterly DB Purge task has been performed before starting this task.
Follow these steps to archive production database backups off Acquia and onto AWS.
SSH onto the Acquia Prod environment.
Go to the Acquia on-demand backups folder:
List the backups, then check and note which backups are more than 12 months old (if the quarterly purge DB task has been regularly completed, there should only be one backup per month):
On your local machine, use rsync to copy down the backups. For example, to copy all backups from 2022:
SSH onto the Acquia Prod environment and delete the files which have been archived. Continuing the example from above (use -i to be safe):
Name | Type | Description |
---|---|---|
See on Upaknee.
Upaknee API documentation:
From the local machine (in the example above, ~/dumps), copy the files up to the S3 bucket:
payload | string | A JSON formatted array of cityscore metric objects. |
api-key | string | Authentication token |
Update Type | Action to take |
---|---|
Security Updates | These should be applied ASAP. Ideally they are deployed as the only change in a new commit. |
Normal Updates | It is not necessary to rush these updates; time can be taken to investigate what will be affected and to consider whether the change will affect the website. Even though these updates are less urgent, it is best practice to apply them as they are ready, because not applying them may block other updates. It may be convenient to include them in the next scheduled deploy. |