Historical emergency alert sign up - moved to Everbridge April 2020
Notes as of 1/16/18 (plus relevant edits)
Takes form data from a component on Boston.gov (example here: https://www.boston.gov/departments/emergency-management#sign-up-for-alerts) and processes it with the CodeRed API.
CodeRed is software the City uses to send emergency notifications. People need to opt into the system.
We have a proxy application that sits in the middle for three reasons: CodeRed/Alert Boston requires a login every time before interacting with it; the front-end UI of Alert Boston does not match the City's brand; and we cannot easily integrate or embed the Alert Boston/CodeRed UI into boston.gov. The hosted form at https://public.coderedweb.com/cne/en-US/BFB5F355FAB8 does the same thing as the form on boston.gov, but also allows for multilingual sign up.
Contacts (no longer with City):
Product: Josh Gee
Engineering: Matt Crist, Fin Hopkins
Documents:
Contact Form
This application receives the content and headers from our contact form (email) component, stores the data, and queues it for delivery to our mail client (Postmark). The app is hosted on our AWS as an ECS container along with other Services-JS apps. Every instance where the contact form is used corresponds to an entry in a database that issues an application token and records where each form should send its email to.
When a contact form is submitted, it sends the form data and the application token to a NodeJS service on AWS. The service stores the form data in a database; a recurring process then sends the latest entries in the DB to our mail service (Postmark), which queues and sends each email to its corresponding recipient.
Notes from 2017:
Original concept 2012 - Had map based tool
Debuted in Feb. 2013 during winter storm Nemo
1 million hits in first hour because of national story
Latest launch 2015.
Related news articles:
Two tools:
Javascript script to show who is in which piece of equipment. | Is this SnowStats > not going away?
GPS
Have either been sitting there for more than 20 minutes? > snowcop
Changing our GPS trackers > timeline unknown
Trimble (old provider) running throughout the winter
Ping time 90 seconds for hardwired units, 120 seconds for hand held units for contractors
Samsara (new provider) get up and running while Trimble keeps running
Ping time 4 seconds
Can grab via API
City has ~100 hard-installed boxes in its vehicles, and about ~700 contractor units, much like a Sprint phone.
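A quick back-of-envelope comparison of what those ping intervals mean for data volume per vehicle:

```javascript
// Location updates per vehicle-hour implied by each provider's ping interval
// (intervals taken from the notes above).
const intervalsSec = { trimbleHardwired: 90, trimbleHandheld: 120, samsara: 4 };
const pingsPerHour = Object.fromEntries(
  Object.entries(intervalsSec).map(([name, sec]) => [name, 3600 / sec])
);
// Samsara's 4-second interval yields 900 updates/hour per vehicle, versus 40
// for a hardwired Trimble unit: over 20x the location data to ingest via API.
```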
Currently redirecting to boston.gov/snow
--
Allows residents to monitor Boston's snow operations, including an overview of the season showing the total miles and hours plowed, salt used, and total snowfall. It also shows what phase the snow operation is in, and what tasks are being completed in each snow response phase.
Links
App: http://snowstats.boston.gov/ [currently unresponsive] Github: https://github.com/CityOfBoston/snowstats
Contacts
Historical App Engineering: Max Handler/Qlarion, Claire Lane
Public Works Representative: Paul Taylor
Latest iteration launch: 2015
Related news articles:
1. https://www.cityofboston.gov/news/Default.aspx?id=18976
2. https://www.americaninno.com/boston/boston-snow-stats-mayor-marty-walsh-announces-snowstats-boston-gov/
3. http://www.wbur.org/news/2015/02/02/boston-snow-stats
Not relevant for active project, but for historical information/reference. Last updated July 2018.
We're working to provide a digital form alongside all paper forms and pdfs in the City of Boston. If you want to come to City Hall to do business with us, we'll be here for you. But, if your only option is to come to a City office to do business with us, then we can do better.
When the City of Boston Digital Team started our project to digitize 'all forms', we didn't have a list of which forms residents could fill out online. We had to start by scraping the City's old website (cityofboston.gov) for all PDF files. That, plus interviews with departments and reviews of other documentation, helped us pull together an initial list. Even with months of auditing, we're still finding new forms every day.
Early in the process, we went through a bid process for a pilot program for a web form tool. We solicited three bids and procured a partner, SeamlessDocs. Our goal in the pilot was to move fast and learn. We weren’t trying to fix every department’s process but instead to focus on the customer experience and digitize paper forms and pdfs.
There is huge demand to move forms online in the departments.
We initially thought there would be a strong demand for submissions that look exactly like current paper forms. That hasn't been the case.
Departments almost immediately asked for tools to help them move to all-digital processes;
We've avoided getting into workflow changes, but there is a growing need for a workflow tool;
Functionality around conditional fields, logic, and branching are really important;
We've learned that not all forms are created equal. Some are simple, but most are either interrelated or kick off complicated business processes. Just moving those online isn't much help. Some are better served as full applications. A good example is death certificates. After learning more about the process, we realized a form isn't a good solution, since we wanted to give residents the ability to look up and order a certificate once they knew we had it (right now, they request a record blind). So we are building an entire application that lets residents look up and order death certificates.
As we expand, we'll need both a tool for building forms and a flexible workflow tool into which information from simple forms and web applications can be fed. We've done some analysis and are working on an RFP around that. Since that's a bigger project, we're currently coordinating among multiple stakeholders to release the RFP.
However, the pilot has also shown us we'll always need the ability to quickly and easily create branded forms.
Known bugs/how to resolve (support chat transcript, 5:51 pm December 22):
City: Hello, I'm attempting to customize submission email notifications. I want to have a 'full name' field show up in the subject of the email, but the only field (out of a handful of fields) available for me to map is a single-line input field. Any thoughts?
SeamlessDocs: Hey there, is the Full Name field assigned to a specific signer?
City: Sorry, what do you mean?
SeamlessDocs: Sometimes if fields are assigned to a specific signer, they can't be pulled into the email subject. But actually, are you working with a SeamlessDocs (PDF) or Web Form? Could you send me a link to your form?
City: https://boston.seamlessdocs.com/forms/builder/CO17121000048706650 They shouldn't be assigned to a specific signer; we want whatever the applicant put in their name to appear in the subject line.
SeamlessDocs: I see. Try using a Single Line field for Full Name, instead of the Full Name field. That should work.
Deployment practice and workflows from March 2019.
The following shows the various stages of a deploy to production.
Developer checks out the develop branch of the main repository.
Developer builds local docker container, and builds Drupal site in container.
Developer creates a new working branch, e.g. my-branch.
Developer makes necessary changes to website and/or to PHP code.
Developer tests changes locally.
There are scripts which can be used to make the container and build Drupal.
Important - don't forget ....
Developer updates features (using Drupal features module) as needed.
Developer commits code and features changes to the my-branch branch of the local repository.
Developer runs local PHPUnit, Behat and linting tests.
Developer pushes local branch my-branch to a branch of the same name (i.e. my-branch) on the CoB GitHub repository.
Developer creates a new Pull Request (PR) to merge my-branch into develop on the boston.gov-d7 GitHub repository.
Developer provides appropriate notes (in template form) in the PR comments.
Developer assigns a peer-developer to review the code.
Once a new PR to develop is created in GitHub, Travis starts a build verification process, which attempts to build a new Drupal site from the files in my-branch, and then runs various linting, PHPUnit and Behat tests.
Developer merges my-branch into develop when the peer review is complete and the Travis build tests pass.
Developer deletes my-branch locally and on GitHub when the merge to develop is complete.
It is acceptable to use the GitHub "Squash and Merge" function when merging a branch into develop.
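The local steps above can be sketched as shell commands. This is a simulation in a throwaway local repo (the real work starts from a clone of CityOfBoston/boston.gov-d7, and the container build/test scripts are omitted):

```shell
# Simulate the branch flow: develop -> my-branch -> squash-merge back.
mkdir -p /tmp/deploy-demo && cd /tmp/deploy-demo
git init -q
git config user.email dev@example.com && git config user.name dev
git checkout -q -b develop
git commit -q --allow-empty -m "base"
git checkout -q -b my-branch              # new working branch off develop
echo "change" > file.txt                  # stands in for code/features edits
git add file.txt && git commit -q -m "Describe the change"
git checkout -q develop
git merge -q --squash my-branch           # "Squash and Merge" is acceptable here
git commit -q -m "Squash-merge my-branch"
git branch -q -D my-branch                # delete the branch once merged
```

In practice the merge happens through a GitHub PR after peer review and a passing Travis build, not on the command line.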
Travis monitors the GitHub develop branch, and when a commit is performed (either a direct commit or a merge) to the branch, Travis starts a deploy process:
The dev Acquia server/environment monitors the Acquia develop-build branch, and when that branch is updated (i.e. a merge/commit is made) it automatically pulls the updated code onto the appropriate server, and:
backs up the database on the Acquia dev environment,
copies the database from the Acquia stage environment to the Acquia dev environment, and
runs processes on the dev environment to sync the (updated) code and the (copied) database.
A Senior Developer from the team creates a PR to merge develop into master when a deploy is desired. Ideally this is done frequently, with just a single branch per deploy.
Reviewers are assigned (see reviewer notes here):
The City of Boston Website Product Manager, and
QA representative.
The Senior Developer merges the PR into master.
Do not use GitHub's "Squash and Merge" feature when merging the PR, as this breaks consistency between master and develop on GitHub.
Travis monitors the GitHub master branch, and when a commit is performed (either a direct commit or a merge) to the branch, Travis starts a deploy process:
The deploy process:
re-runs the build tests,
clones the master-build branch from an Acquia-managed git repository,
copies the built website from the Travis container over the files in the cloned branch, and
then commits the master-build branch to the Acquia repository.
The Acquia server/environment monitors the Acquia repository branch, and when it is updated (a commit is made) pulls the updated code onto the appropriate server.
The stage Acquia server/environment monitors the Acquia master-build branch, and when that branch is updated (i.e. a merge/commit is made) it automatically pulls the updated code onto the appropriate server, and:
backs up the database on the Acquia stage environment,
copies the database from the Acquia prod environment to the Acquia stage environment, and
runs processes on the stage environment to sync the (updated) code and the (copied) database.
After testing is completed:
The City of Boston Website Product Manager (or someone delegated) completes change and release documentation, and
ensures the developer(s) who made the changes in the release are available in case of issues in production, and then,
using the Acquia Cloud web-UI, the Product Manager (or someone delegated) drags the code from the stage environment to the prod environment.
Acquia hooks will detect that the code has been moved and will:
back up the Production database, and
run drush commands to update configurations from the new code into the Production database.
==========================================================================
The automated deploy process follows continuous deploy (CD) principles whereby:
The deploy workflow is engineered so that all developers are able and enabled to perform a deployment,
Wherever possible, the workflow is automated to remove the need for manual tasks and testing.
The primary tools used by City of Boston in the CD workflow process are:
Docker to manage local development environments.
GitHub for code storage and deploy initiation.
Travis for automated testing, building and packaging.
Acquia Cloud (acapi and cloud webhooks) for deployment.
Secondary tools used by City of Boston in the CD process are:
Phing to abstract scripting processes used in build, test and packaging.
PHPUnit and Behat to perform automated testing.
Overall the engineering workflow is as follows:
City of Boston operates two websites (hub and boston), each being an "application" on Acquia. The Drupal code-base of the two websites is very similar; the main differences are some settings files and the content contained in each application's database.
The overall engineering utilizes a single repo in GitHub, and a single repo in Acquia's Git.
The workflow engineering creates and manages a single develop branch and a single master branch in GitHub; these are ultimately common to the hub and boston websites.
However, the Acquia environments (dev, stage, prod) in each application (boston, hub) are attached to different branches of the Acquia git. Hub branches are suffixed with -hub. These branches are created by and during the Travis packaging process.
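The branch fan-out above can be expressed as a small lookup. The -hub suffix comes from the notes; the exact combined build-branch names are assumptions:

```javascript
// Map a GitHub branch plus an Acquia application to its Acquia build branch.
// Branch-name composition is an assumption based on the suffix convention.
function acquiaBuildBranch(githubBranch, app) {
  const base = githubBranch === "master" ? "master-build" : "develop-build";
  return app === "hub" ? base + "-hub" : base;
}
```

So one commit to GitHub develop or master fans out, via the Travis packaging step, into one build branch per application on the Acquia side.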
Documentation for historical context
-- DECOMMISSIONED Mid 2022 --
Project Background
Project came out of Department of Neighborhood Development (DND)
Qlarion built
When originally developed, the open data portal didn't exist (or wasn't in its current state); today the app could use the CKAN API.
Launched publicly early August 2017 - no known issues or outages since launch
Iterations
Been rebuilt a few different times
Original version was built with additional data sources, but was pared down to show valuable info quickly
Initially very GIS focused
Technology used
Hosted on heroku [migrated from Qlarion to City's heroku in July 2018]
Uses heroku resources - uses single production dyno
Could be moved to AWS [down the road], but would need to create a PHP container in the current AWS setup
Built in php and javascript
Bootstrap front end [Sebastian advised on this at some point]
Data Services (SAP) sits on the City side. A nightly job loads data into a Postgres database: Data Services drops the data, then a PHP script stages it in Postgres to make it presentable. SAP = ETL platform. A PHP script gets the data to the website.
The code includes an assurance that the data job has run before page load.
index.html - served on apache
Some known data sources:
ISD
Lagan/311
Assessing
Open question: Are there more?
Areas for improvement
Adjust dynos and other Heroku resources (to bring the cost of the app down)
Adjust data source/feed set up
Revisit entire structure: Qlarion estimated would probably take 1 week of time to switch the data sources and restructure data
Heroku app is backed up to GitHub; the database and some creds are saved on S3 (cob_digital_archives).
DNS redirect for rentsmart.boston.gov =>