Other historical documentation can be found here: https://drive.google.com/drive/u/0/folders/0B87NxJc5bqvTLTlfNEJVNUZtNjQ
Developed by Code for Boston, deployed and hosted by the Digital Team.
*Put into 'maintenance mode' by Code for Boston as of March 2021. They will continue to monitor any pull requests and do their best to fix issues that come up re: broken functionality. They also asked about passing ownership fully to the City. Jeanethe told them we don't currently have the bandwidth, but that we could potentially reconsider in late 2021.
--
We have a CodeBuild setup that listens to a particular branch of the CodeForBoston/voiceapp311 repo. When it receives a change it runs:
deploy_tools.py -f to update the $LATEST Lambda function
deploy_tools.py -i to update the interaction model
This causes the latest changes to immediately go live in development and beta uses of the skill.
When the time comes to release changes to the interaction model, we need to do the following:
Note the Alexa Skill ID:
Access Alexa Skill,
Click on "Boston Info", and then click "Endpoint" in right column,
Copy the Skill ID
Publish the current BostonInfoSkill Lambda function as a new version, described with the current date (e.g. "2019-05-22").
Access AWS Lambda Console,
Select BostonInfoSkill,
Click on the "BostonInfoSkill" (orange icon) in the designer and ensure the "Function Code" "Runtime" is set to Python 3.7,
Click on "Actions" button,
Click Publish in drop-down list
Enter date into the text-box on the pop-up dialog, and click "Publish" button.
Create a new alias pointing to that version.
Select BostonInfoSkill from functions, then select "Create Alias" from the "Actions" button.
In the UI, add “Alexa Skills Kit” as a trigger
Click on BostonInfoSkill in the Designer panel,
Add “Alexa Skills Kit” as a trigger, and paste the Skill ID (from step 1 above) into the configuration box,
Save
Update the Alexa skill’s endpoint ARN to reference the new alias.
Access Alexa Skill
Click on "Boston Info", and then click "Endpoint" in right column,
Update the Default Regions ARN to reference the alias created above.
Use the “Test” page (in the Alexa Skill) to make sure that the lambda is receiving traffic correctly
Submit the skill to Amazon for certification (snapshotting the endpoints)
After the Amazon verification process completes:
Change the Alexa skill’s endpoint back to $LATEST
It might be nice to automate this (see the sketch below), but we'll see how often we need to do it. Also, we could use a blue/green strategy for the aliases rather than creating new ones each time.
We don’t keep a single “production” alias because the current production skill needs to keep its same endpoint while we’re waiting for the new version of the skill to pass certification.
Because we cut releases using Lambda aliases, we can update the code behind the skill without going through a re-certification process, as long as we don’t need to change the interaction model.
This is also a manual process:
Have the code built by the development / beta updates CodeBuild so that it becomes current
Publish the current Lambda function as a new version
Update the current production alias to point to that new version
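If we do automate this, a minimal sketch using the AWS SDK for Node (aws-sdk v2) might look like the following. The region, alias naming, and statement ID are assumptions, and the skill endpoint still has to be updated in the Alexa console; treat this as an illustration of the steps above, not our actual tooling.

```typescript
import { Lambda } from "aws-sdk";

const lambda = new Lambda({ region: "us-east-1" }); // region assumed

async function releaseNewVersion(alexaSkillId: string): Promise<string | undefined> {
  // Publish the current $LATEST code as an immutable version,
  // described with the current date (e.g. "2019-05-22").
  const version = await lambda
    .publishVersion({
      FunctionName: "BostonInfoSkill",
      Description: new Date().toISOString().slice(0, 10),
    })
    .promise();

  // Create a new alias pointing at that version.
  const aliasName = `production-${version.Version}`;
  const alias = await lambda
    .createAlias({
      FunctionName: "BostonInfoSkill",
      Name: aliasName,
      FunctionVersion: version.Version!,
    })
    .promise();

  // Add the "Alexa Skills Kit" trigger, locked to our Skill ID.
  await lambda
    .addPermission({
      FunctionName: "BostonInfoSkill",
      Qualifier: aliasName,
      StatementId: "alexa-skills-kit-trigger", // statement ID assumed
      Action: "lambda:InvokeFunction",
      Principal: "alexa-appkit.amazon.com",
      EventSourceToken: alexaSkillId,
    })
    .promise();

  // This ARN is what goes into the skill's Endpoint settings.
  return alias.AliasArn;
}
```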
Sometimes Amazon will reject the skill. Here’s what to do afterwards, which is a combination of the above steps.
Remove the :production-X alias from the end of the endpoint in the Alexa skill's settings so that the dev skill goes back to pointing at the latest version of the Lambda function built by the CodeBuild process
Fix any issues by pushing changes to GitHub and having them be rebuilt
Once you’re sure Amazon will be satisfied:
Publish a new version of the Lambda function via the web console
Update the previously-used production-X lambda alias to point to that new version
Update the Alexa Skill endpoint settings to add :production-X back into the ARN
Cross fingers
Re-submit
More information about Alexa deployment strategies: https://blog.codecentric.de/en/2018/06/non-breaking-lambda-deployments-for-alexa-skills-using-versions-and-aliases/
The Alexa skill belongs to the City of Boston organization within the Amazon Developer console at https://developer.amazon.com/
Amazon Developer accounts are in the same namespace as Amazon.com shopping accounts, and are completely separate from AWS and IAM accounts.
Deployment tools authenticate with Amazon using the Login with Amazon ("LWA") system. This is based on OAuth, and provides access and refresh tokens. These tokens allow the tools to act on behalf of accounts.
Applications using OAuth must be registered with LWA and receive a client ID and client secret. The ask CLI tool is already registered with Amazon and has an ID and secret baked into it. For our own tools, we register by creating a "Security Profile" in the Amazon Developer console. You can also specify a client ID and client secret when running ask by setting the ASK_LWA_CLIENT_ID and ASK_LWA_CLIENT_SECRET environment variables, respectively.
To get access and refresh tokens, use ask util generate-lwa-tokens. This will prompt for your tool's client ID and client secret and then open a browser so that you can grant your tool access to your Amazon account for managing your Alexa skills. You can see the authorization in the "Login with Amazon" section of your account on Amazon.com.
Because the access token and refresh token are tied to a particular Amazon account, we created a service account that is in the City of Boston Amazon Developer organization. The deploy tools are configured with a refresh token tied to this account.
The ask tool is typically initialized by running ask init, which kicks off an OAuth flow to authenticate with developer.amazon.com and also tries to get AWS credentials. It then stores that information in ~/.ask/cli_config. Because we can't run ask init non-interactively, we configure ask using environment variables. These are set in the Terraform configuration for the CodeBuild resource.
ASK_LWA_CLIENT_ID — The client ID for our Login with Amazon security profile
ASK_LWA_CLIENT_SECRET — The secret for our Login with Amazon security profile
ASK_VENDOR_ID — Vendor ID for our City of Boston organization
ASK_REFRESH_TOKEN — A refresh token for our deploy tools service account, authenticating against the above security profile
Additionally, we need to provide AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables to keep ask from trying to look in the non-existent cli_config file. Since the deploy_tools.py script doesn't use the ask tool to communicate with AWS, we just set these to dummy values.
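For reference, here is a minimal sketch of how a tool can exchange the refresh token for a short-lived access token against the standard LWA token endpoint. The env var names match the ones listed above; error handling is omitted.

```typescript
// Runs on Node 18+, where fetch is available globally.
async function getLwaAccessToken(): Promise<string> {
  const res = await fetch("https://api.amazon.com/auth/o2/token", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      grant_type: "refresh_token",
      refresh_token: process.env.ASK_REFRESH_TOKEN!,
      client_id: process.env.ASK_LWA_CLIENT_ID!,
      client_secret: process.env.ASK_LWA_CLIENT_SECRET!,
    }).toString(),
  });
  const json = await res.json();
  // Access tokens are short-lived; re-run the exchange when one expires.
  return json.access_token;
}
```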
Amazon associates a "Vendor ID" with an account's organization. You can see the Vendor ID for City of Boston by running ask api list-vendors.
UX Testing and Enhancements 2020
A General Assembly UXDI team did user research and testing on the Alexa skill. Copy of their work here: https://drive.google.com/open?id=12lQXgtARUEt5_Uo_kBLuAlE8DyJJ-UL0
A place to collect process documents and historical information regarding the Access Boston website.
Proposed Ways to Hand-off Config Editing and Icon Upload to the Security team.
Brainstorm/plan ways the Security team can take ownership of the deployment of new icons and updating Access Boston's config file. 2-Step Deploy Plan: The Security team will update the config files in S3 via SFTP and the Digital team will restart the applications.
The first step is giving them SFTP access to the S3 bucket directory containing the app config `.yml` file. Once they make the changes, they will create an 'Issue' ticket in our GitHub project for the Digital team to schedule a deploy to the service.
The second part is triggering a deployment of their applications on AWS's ECS. Depending on the details of the change, we would go to the corresponding instance in either "AppsProdDeploy" or "AppStagingDeploy" and 'force a new deploy' by using the 'Update' instance button.
Dedicated Git Repository Deploy Plan: The Security team opens a pull request against a private repository, whose build process will trigger a build/deploy request through Travis CI and restart the corresponding app containers.
Setup a private repository the Security team can update.
Replicate part of the deploy process (Travis, Slack, etc.) the monorepo uses to update the apps in AWS
SFTP/S3 and LAMBDA Deploy
Give the Security team SFTP access to the S3 bucket path containing the app config files.
Create a Lambda function that gets triggered when the contents of the S3 bucket are updated
Connect to the corresponding Access Boston environment using the ECS CLI and 'force a new deploy' of that application (see the sketch below).
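A minimal sketch of that proposed Lambda, with assumed cluster/service names and S3 key layout: it maps an update to a config file in the bucket to the matching ECS service and forces a new deployment.

```typescript
import { ECS } from "aws-sdk";
import { S3Event } from "aws-lambda"; // types from @types/aws-lambda

const ecs = new ECS();

export async function handler(event: S3Event): Promise<void> {
  for (const record of event.Records) {
    const key = record.s3.object.key; // e.g. "access-boston/prod/apps.yml" (layout assumed)
    // Map the updated file path to an ECS cluster/service (names assumed).
    const target = key.includes("/prod/")
      ? { cluster: "AppsProd", service: "access-boston" }
      : { cluster: "AppsStaging", service: "access-boston" };

    // Equivalent of the console's "Update" / "force a new deploy" action.
    await ecs
      .updateService({ ...target, forceNewDeployment: true })
      .promise();
  }
}
```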
Resources:
Edit and commit changes to the items displayed in Access-Boston Dashboard
We created this repository in GitHub to manage changes to the Access Boston dashboard by editing the config files for each of the environments it runs on. The following are the steps needed to commit changes; the repository will then automatically notify the Digital team that a new deploy to AWS is ready to be kicked off.
Edit Process
From the repository landing page, edit the config file for the environment (dev/test/prod) you want to change by going into the 'src' and then the 'config' folder; then click on the folder for the environment you want to edit.
Click on the 'apps.yaml' file, from the details view click the 'Edit this File' icon.
Adding new links requires 3 of the following fields:
title
url
*groups: Groups is a list of the groups of people with access to that application. The formatting should follow the style of the existing groups entries.
*icon: Icon is required for links in the 'Apps' section, at the top of the file
When you're done making changes, go to the bottom of the page where it says 'Commit Changes' and provide a name and description for the changes made.
Leave the "Commit directly to the 'master' branch" radio button checked
When you're done, hit the "Commit Changes" button
Now you can go back to either the homepage or the commits page to view this commit's progress.
Homepage:
Commits Page:
If you see a yellow dot next to the commit, it's still being processed; once it's done, the dot will change to a green check mark if it passed or a red x if it failed.
Passed:
Failed:
Once the build for the commit passes, our build integration with Travis notifies the Digital Team via Slack that we can restart the application on AWS.
Environments
Production: https://access.boston.gov
General Documentation
Active tickets can be found at Digital Maintenance Project Board;
Historical tickets for the product can be found online here.
Flowcharts
Browser Support
Support for the latest industry-standard browsers
It is okay to require JavaScript
INACTIVE
This project is unfinished but lived inside our monorepo; we started to get security warnings for modules that needed to be updated, but doing so was proving to be difficult. Instead of continuing with security updates, we are removing this project from the monorepo. We made backups so we can get back to this point if we revisit this project.
Historical wiki information on this project:
New frontend to replace 311.boston.gov when the new Salesforce-based backend rolls out.
Dev:
UAT:
GitHub:
App:
Indexer:
Mayors24 redirector:
Crowdsourcing app:
Designs:
Product: Reilly Zlab
Engineering: Fiona Hopkins
Prediction Endpoint: Albert Lee, Maria Borisova
Add in status bar. Example:
Inline editing of information before submitting. Example:
Product: Kayla Patel
Engineering: Fin Hopkins, Jessica Marcus, Keith Donaldson (previously John Fleurimond)
Backend Engineering: Rich Oliver
This app is piloting the new web application development patterns outlined in Accelerating Webapp Development.
Could use react-jsonschema-form to help generate the application form.
production: https://permitfinder.boston.gov/ (UI updated by Fiona Hopkins in 2019) | staging: https://permit-finder.digital-staging.boston.gov/permit
Permit Finder is a tool for looking up the status of a building or fire permit. It is friendlier to use than Hansen, allows non-authenticated access, and shows the name of the inspector currently assigned to review the permit.
The implementation in the Digital monorepo is a rewrite of the original Permit Finder, which was written in PHP/jQuery by Qlarion.
Permit Finder currently receives about 500 unique users a week.
PermitFinder Product Proposal (for future iterations, mostly completed by Susanna Ronalds Hannon)
Code archive: PMv3.27.zip
Permit Finder data is pulled out of Hansen by an ETL job running in Civis. This runs every 30 minutes. It generates 3 CSV files:
DataElementExport.csv
MilestoneExport.csv
ReviewExport.csv
These files contain information about the permit, the most recent milestone it has hit, and any reviews for the permit, respectively. All of the data is keyed off of the permit number. All dates and times are in Eastern time.
The ETL job exports the data via SFTP.
In the previous implementation, this was by SFTPing to the EC2 instance that was running the webserver. The PHP scripts would load and parse the CSV for each request.
In the rewrite, the SFTP destination is an AWS Transfer endpoint in the Digital team's AWS account. AWS Transfer uses S3 as its backend. In our case, Civis authenticates as the civis user using an SSH private key and the data gets written to the cob-digital-analytics-buckets S3 bucket under permit-finder/.
The AWS Transfer endpoint, civis user, and bucket are all generated from Digital's Terraform templates: sftp.tf and analytics_uploads.tf. If the user's public key needs to be changed, or if new users or buckets need to be added, that can be done from those templates.
The containerized Node servers in the new version periodically query S3 to see if the files' last modified date has changed. If it has, they stream the files in, parse the CSV, and write the rows to a local, temporary Level database. (See: PermitFiles.ts)
What's Level? Level is a Node wrapper around LevelDB (developed by Google and used in Chrome), a very fast disk-based key-value store with a straightforward Node API.
There’s too much data to store comfortably in memory in the Node processes, but we don’t need the added complexity of a completely separate database or Redis store.
Level is a happy medium between the two that keeps the data localized in the Node process's container (the databases are created under /tmp) but without taking up too much memory.
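A condensed sketch of that polling-and-load loop, assuming aws-sdk v2, csv-parser, and level; the CSV column name and database path are assumptions, and the real logic lives in PermitFiles.ts.

```typescript
import AWS from "aws-sdk";
import csv from "csv-parser";
import level from "level";

const s3 = new AWS.S3();
const db = level("/tmp/permit-finder-db"); // local, temporary Level database
const Bucket = "cob-digital-analytics-buckets";
const Key = "permit-finder/DataElementExport.csv";
let lastModified: Date | undefined;

async function pollAndLoad(): Promise<void> {
  const head = await s3.headObject({ Bucket, Key }).promise();
  if (lastModified && head.LastModified?.getTime() === lastModified.getTime()) {
    return; // file unchanged since the last poll
  }
  lastModified = head.LastModified;

  // Stream the CSV out of S3 row by row and write each row to Level,
  // keyed by permit number, rather than holding everything in memory.
  await new Promise<void>((resolve, reject) => {
    s3.getObject({ Bucket, Key })
      .createReadStream()
      .pipe(csv())
      .on("data", (row) => {
        db.put(row.PermitNumber, JSON.stringify(row)); // column name assumed; fire-and-forget for brevity
      })
      .on("end", resolve)
      .on("error", reject);
  });
}

// Check for new exports periodically.
setInterval(pollAndLoad, 5 * 60 * 1000);
```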
Currently the rewrite is available on staging at permit-finder.digital-staging.boston.gov. Its UI is feature-complete relative to the current production implementation.
We have successfully tested out having Civis upload the files through the AWS Transfer endpoint and loaded them in the app.
To go to production, we need:
Product / QA signoff
The ETL job to get scheduled to run regularly
Setting the number of tasks in the Digital Prod cluster from 0 to 2
DNS change for permitfinder.boston.gov
This code was written very fast by someone in her last days at the City. So there’s that. She apologizes in advance.
We’re re-using the existing SFTP scheme for the ETL job, so that part should stay relatively stable, though we’re now SFTPing to an AWS Transfer instance rather than directly to a Linux server. We’ve validated login and upload, though, so we expect this to keep working, and potentially be even more reliable since AWS Transfer is a managed service that Amazon is responsible for maintaining.
Level itself is well-proven technology, but this is the first time we're using it in one of our webapps. There is a chance that its disk usage, or the Docker volume it's on, will grow over time, despite us deleting stale permit data. Unfortunately, the AWS web console does not provide any disk usage information about EBS block devices, so you have to monitor disk usage by SSHing into the box and running df -hT /dev/xvda1 to see how much disk is being used overall. The Permit Finder container's /tmp directory will be somewhere under /var/lib/docker/volumes, though you'll need to sudo -s from ec2-user to root to see it.
The rewrite maintained the SFTPing of CSV files just to keep the number of changes small, especially on the Analytics side. This was done to reduce risk.
Here are areas for improvement:
Have Civis upload to S3 directly. We could enable the Civis account to have access to the Digital team’s S3 bucket via IAM permissions. Civis could put the CSV files there directly, allowing us to remove the AWS Transfer endpoint (unless some other process has adopted it in the meantime).
Use a format other than CSV files. It would be ideal if Civis could generate a file per permit in JSON format, with the milestone and review data collected in it. Those files could be put directly in S3, and the web server could just download the specific permit’s data from S3 and not need to keep its own local store.
Large digital screen on the first floor of Boston City Hall that displays public notices
We support a web app that displays public notices for upcoming meetings on a TV on the 1st floor of City Hall.
https://apps.boston.gov/public-notices/
The data for this is fetched client-side from a JSON API on Boston.gov.
Product Management (at one point): Reilly Zlab
API Engineering: David Upton
Design: Sebastian Ebarb
Historical contacts:
Front-End Engineering: Fiona Hopkins
This app was previously written with Vue and hosted on Heroku. See https://github.com/CityOfBoston/notice-signage. It used Pusher to control when it reloaded data or refreshed its code.
In January 2019 we reimplemented it as a React/Next.js app in the Digital monorepo: https://github.com/CityOfBoston/digital/tree/develop/services-js/public-notices. The new code fetches data itself on a 5-minute loop (with retries / exponential backoff, as sketched below) and automatically reloads itself when the server code changes.
It is now served as static files from S3.
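A minimal sketch of that retry/backoff pattern; the endpoint path here is a placeholder, not the actual Boston.gov JSON API.

```typescript
async function fetchWithBackoff(url: string, maxRetries = 5): Promise<unknown> {
  for (let attempt = 0; ; attempt++) {
    try {
      const res = await fetch(url);
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return await res.json();
    } catch (err) {
      if (attempt >= maxRetries) throw err;
      // Exponential backoff: wait 1s, 2s, 4s, ... before retrying.
      await new Promise((r) => setTimeout(r, 1000 * 2 ** attempt));
    }
  }
}

// Refresh the notices every 5 minutes (placeholder endpoint).
setInterval(() => fetchWithBackoff("/api/notices"), 5 * 60 * 1000);
```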
Case study on the original project implementation: https://www.boston.gov/departments/digital-team/digital-team-case-studies-web-development
The app is shown on a BrightSign HD222 on the 1st floor. It’s running a presentation that consists of just one HTML5 slide pointing at the above URL.
Updates to the sign can be made by overwriting its presentation using BrightSign Author. Talk to the Digital team for the IP address, or ask NOC, since it’s reserved.
The City of Boston's Registry Department manages birth, marriage, and death records for Boston with records dating as far back as 1630. They complete more than 100,000 transactions a year. Their pages are some of the most trafficked parts of Boston.gov and provide vital services to constituents. Patty McMahon is the main end stakeholder for this.
The Department of Innovation and Technology’s (DoIT) Digital Team brought these records online.
When these applications are worked on Rich Oliver and Georges Hawat should be in the loop so that we ensure database or back end application changes happen alongside Digital development. Scott Blackwell can be a resource for this as well.
Along the way, items got added to some of the three apps, but at times they didn't get added completely or consistently.
Items that need to be addressed/re-addressed: https://github.com/orgs/CityOfBoston/projects/2
Small insight into backend applications (video): https://drive.google.com/file/d/1vw_jiPGUUd8vwx9yemEqAjKkeShqkQlK/view?usp=sharing
Payment information
The applications are connected to Stripe for payment. We've explored using InvoiceCloud, but the functionality needed isn't developed/offered by InvoiceCloud. They say they're developing it and it will be usable by the end of 2021. There are ongoing conversations and contracting needs about this. The Administration and Finance Cabinet (Treasury) should always be looped in on these conversations.
International ordering is an interesting challenge for the City; these apps don't currently support international ordering/shipping. Insight from the Registry office: when a user tries to put in an international address, the app won't accept it. To work around this, they advise constituents to enter the following information as the shipping address: Boston, MA 99999. This alerts staff that it is an international order. When the user gets the confirmation email, they ask that the constituent reply to it with the full address they want the record mailed to. After that is received, Registry processes the order. They haven't flagged this as a super high priority item, so we have not chosen to prioritize/implement it.
Allow departments to have an automated appointment scheduling system to limit person-to-person contact during COVID-19. Actively using HubSpot appointments; actively piloting Acuity Scheduling.
*Ray Mejia actively helping maintain.
All documentation has been moved to Google Docs to accommodate cross department documentation: https://docs.google.com/document/d/18ysPPLnKkUCp_JxoCDQ2xQfJhCfmDi3mrZm9IT5BUk8/edit
City solely using Acuity Scheduling
HubSpot account was deleted 11/12/20
Any questions? See Natalie Schwartz, Mayor's Office Fellow
For any system outages or emergency errors, find Reilly, Matt, or Jeanethe and Carissa. Prioritize in-person communication first. If need be, Reilly will enter an incident management 'ticket' and collectively the Access Boston and Digital teams will discuss a plan on how to resolve it following the incident management process. If this outage has happened outside of Monday-Friday, 9 a.m.-5 p.m., then Greg should contact Jeanethe, and Jeanethe will pull in the appropriate resources.
If this is a non-emergency feature request or bug, then the first step should be a meeting (or at least an in-person conversation), so that both Access Boston and Digital teams are aware of this non-emergency item and can discuss whether the work can/will be supported by Digital developers.
If applicable (i.e. if the work will be completed), during or right after this meeting, Reilly will create a ticket in the 'Digital' github repository and send an email to the Access Boston email with a link to the ticket. This ticket will include known details. If there are open questions on the ticket, then Reilly will work with Andreea to resolve these questions. Access Boston team should review the ticket to see if additional details are needed; and add these details into the github ticket.
Simultaneously, Reilly will add this github ticket into the 'Needs triage' column of the main digital project board. Issues will remain in this 'Needs triage' column until they can be discussed during a digital team sync that typically happens on Monday mornings, but periodically moves to Tuesdays due to days off.
As a general rule these non-emergency items will be completed during Digital bash weeks. These weeks typically happen every two months. Bash week dates and priorities can be tracked in github here.
Andreea will create a ticket for the icon including all of the above in the 'Digital' github repository and assign Phil and Reilly.
Reilly will add this github ticket into the 'Needs triage' column of the main digital project board. Issues will remain in this 'Needs triage' column until they can be discussed during a digital team sync that typically happens on Monday mornings, but periodically moves to Tuesdays due to days off.
When the ticket has been discussed it will be moved into the 'Low priority' column until the week it is set to go live. Tickets will be organized top to bottom, with the closest launch date at the top and the furthest-away date at the bottom. Tickets will be moved to the 'High priority' column the week they are set to go live. Then, once the tickets are in progress, they will move through the queues as described on the project board.
When there is more than one environment (i.e. dev and/or test), both Access Boston and Digital teams will test on the different available environments. Digital will move the icon to production when Access Boston has confirmed it is good to go on any applicable test environments. If there is only a production environment, then Digital will make the change straight to production. Approvals from testing should be recorded in the GitHub ticket.
Tickets ready for testing should be assigned to Andreea as well as flagged for her in person so she can assign them to the appropriate Access Boston staff member. If she is unavailable, then the ticket can be assigned to Dinesh; be sure to add the label 'validation'.
Digital will confirm via the github ticket and verbally to Andreea when the icon is in production.
The Assessing Online web application is one of the most trafficked items on boston.gov / Boston's digital properties (see older cityofboston.gov Google Analytics). As part of the 'Apps Modernization' capital-funded project for DoIT/Digital, we are redoing this application. The Assessing Department is aware of these changes and is on board with the project.
Some students have done initial research for the City; find that here: - https://drive.google.com/drive/u/0/folders/16C0NGhkNFQgr1CI2Rg6WEBNH-pdCwwdV - https://drive.google.com/drive/folders/1SvR6ZZLUKsY0JIyIn0EoyqMAaC51U1-K
In early 2021, we worked with a small project group from the United States Digital Response (Joan Liu, John Sullivan Hamilton, and Chris Matthews). They reviewed the above student research, came up with product requirements, and proposed initial designs. They'd like to be kept in the loop on progress periodically, if possible.
Designs, a synthesis of the research, and copies of property bills for reference can be found at: https://drive.google.com/file/d/1L3vlrzXBrI9vFDzkW6QJZ_s9r6BJE1po/view?usp=sharing
Project board (initially implemented by Joan Liu and picked up by Matt McGowan): https://github.com/orgs/CityOfBoston/projects/34
Data model and requirements (originally drafted by Matt McGowan; being collectively worked on by Matt and Jonathan Porter [Analytics]): https://docs.google.com/document/d/1uDnFV2Zv2TeSpO6oFYS5IkqQLMWkyhHFr6U8fptfiYE/edit?usp=sharing
Misc. documents, notes, etc. can be found here: https://drive.google.com/drive/u/0/folders/1YVq9VZKIiisBCRZStUHwMA7ZCDIdIFCZ
We have moved on to the development stage, which is being run by Matt McGowan and Phil Kelly.
This page is used to log the annual assessing process as it relates to the assessing online app.
Nicholas Ariniello (Nick) - Commissioner
Francis Gavin (Fran) - Assessing PO
Arlande StLouis
asp/IIS cityofboston.gov (front-end for assessing-online)
MSSQL Database (contains valuations data)
Assessing Dept Access Database (Staging Database)
Munis
Tyler (aka Patriot) (aka Camea)
01 - 31 December: Assessing Dept loads data (inventory) into Munis in prep for the next calendar year; (Fran) uploads data (mainly from Munis) into the Access Database; (Assessing Dept) verifies data (valuations + property attributes) in the Access Database; (Fran) generates property valuations.
01 January: (Assessing Dept) finalize fiscal values and ownership; (Digital) Online Database updated from the Access Database and published to the website; (Assessing Dept) opening of property-owner "Apply for Exemptions" and "Appeal Valuation" periods.
01 February: close of the "Appeal Valuation" period; (Digital) ensure the form is removed from the Assessing Online app; no Online DB updates at this time.
01 April: close of the "Apply for Exemptions" period; (Digital) ensure the form is removed from the Assessing Online app; no Online DB updates at this time.
01 July: (Assessing Dept) updates the Access Database, with no changes to valuations, just existing inventory attributes and the addition of new properties; (Digital) Online Database updated from the Access Database and published to the website.
There are approx 176,000 tax bills (aka Parcels) (aka Inventory Items) (aka Properties)
boston.gov/cityscore
Contacts: Matt McGowan, David Upton, or Maria Borisova
https://registry.boston.gov/birth
There were more constraints with this product than there were with death certificates. State law prohibits access to the records of individuals whose parents were not married when they were born. These restricted records create a security and privacy challenge for us to tackle.
Also found here:
We focused on death certificates first because of the way the laws are structured in Massachusetts. Death certificates are entirely public record throughout the Commonwealth, so there are fewer privacy and security concerns associated with them.
The Registry Department has been working to digitize death certificates over the last few years, building out a database to be used for fulfillment. The database currently has records back to 1956.
We were able to leverage this database and connect it to a user-facing application which allows constituents to order death certificates online. Payment processing is routed through Stripe. The Registry receives the payment as well as the order request and can then ship a certified copy of the death certificate to the constituent by mail.
We successfully launched the death certificates application in March 2018 and have seen increases in online purchases (versus in person or by mail) since implementation. You can read more about the project.
Original Project Team
Product Management: Rachel Braun
Engineering: Fiona (Fin) Hopkins, Jessica Marcus
Design: Caroline Stjarnborg
Database/Fulfillment: Rich Oliver, Scott Blackwell
Subject Matter Expert: Patty McMahon (City Registrar) and Registry Team
Testing/staging environment:
(Re)visit online marriage intention during COVID19. One main goal: Minimize public interaction with City Registry staff and computers.
Background research/product findings found here: https://docs.google.com/document/d/1gyvpy50TFBQj0w0z4cyV4eZRpKbQminetykqr3_rPqE/edit#
Sample forms (for printing): https://drive.google.com/drive/u/0/folders/1V6GTEN3R0E1M7USECAFhJSy1QYIA-EUn
Image of historical app: https://drive.google.com/file/d/1xpqdHD_yV8lsPKjFW336PcuDNPwbwVWC/view?usp=sharing
https://registry.boston.gov/death | Launched in March 2018
Staging: https://registry-certs.digital-staging.boston.gov/
Bunch of historical documents: https://drive.google.com/drive/u/0/folders/1cA1LXMcCAOcwKHJyYJAJWxjGwJ-C9Dpv
Login system to save shipping / credit card details and show order history
ACH support, either through Stripe customers behind a login system, or instant verification with something like Plaid
Marriage certificates
Birth certificates
Original Project Team
Product: Josh Gee, Rachel Braun
Front End Engineering: Fin Hopkins
Back End (Fulfillment) Engineering: Scott Blackwell, Rich Oliver
The City partners with vendors on native mobile apps. Most of these apps were started by other groups.
All apps except Colu are hosted on the City's Apple Developer and Google Play stores.
Misc. app specific info:
Trash Day (Recollect) - Joyce John updates addresses.
https://registry.boston.gov/marriage
Test: https://registry-certs.digital-staging.boston.gov/marriage
Historical information on this: https://drive.google.com/drive/u/0/folders/16k9PSEG6-m-4kWbl_mrP2tCqwLF6TOht
Scans of what the blank forms and certificates look like: https://drive.google.com/drive/u/0/folders/1V6GTEN3R0E1M7USECAFhJSy1QYIA-EUn
This page documents implementation of the Google Translate Basic API on boston.gov.
Previously, the Bing Translate Web Widget (a free javascript widget) was used to translate content on boston.gov. The widget was deprecated in July 2019. A new translation solution is required.
The free Google Translate JavaScript widget is no longer being serviced, and while it is still usable, the translations it yields proved too poor for it to remain a viable option for Boston.gov. As an alternative, Google offers two versions of their Cloud Translation API: Basic (i.e. pre-trained) and Advanced (i.e. uses your domain-specific data; training required).
The Basic API translates text that appears between HTML tags. The output retains the (untranslated) HTML tags, with the translated text between the tags to the extent possible given the differences between the source and target languages. The order of HTML tags in the output may differ from the order in the input text due to word order changes in the translation.
Using just an API key will make a RESTful API call to the server (see the sketch below). This would require multiple API calls, and thus would prove costly on high-traffic pages. Using the cloud client libraries would require fewer API calls, with static pages stored on the server.
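A minimal sketch of that key-only REST call against the Basic API endpoint; the key variable and target language are placeholders.

```typescript
const API_KEY = process.env.GOOGLE_TRANSLATE_API_KEY; // placeholder

async function translateHtml(html: string, target: string): Promise<string> {
  const res = await fetch(
    `https://translation.googleapis.com/language/translate/v2?key=${API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // format: "html" asks the API to preserve tags and translate only
      // the text between them, as described above.
      body: JSON.stringify({ q: html, target, format: "html" }),
    }
  );
  const json = await res.json();
  return json.data.translations[0].translatedText;
}
```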
Cloud Translation - Advanced supports translating text using custom AutoML Translation models, and for creating glossaries to ensure that the Translation translates a customer's domain-specific terminology correctly.
Before you can use Cloud Translation - Advanced, you must enable the AutoML API (automl.googleapis.com) if you want to use AutoML custom models for your project. If you plan to use a glossary or the batch features, you also need to create a Google Cloud Storage bucket and grant your service account access to it.
Based on feature/price comparisons, we decided to move forward with the Basic API.
There are some municipalities that use the Google Translate web translator directly, rather than the API or widget, to translate content on their sites. The State of Maryland's website, for example, employs a small bit of JavaScript which provides users with a Translate button and dropdown in the site header. When a user selects a language in the dropdown, all the text is run through the web translator and a translated version of the page is displayed. There is no call to an API, thus no cost. Example:
This solution has zero cost, but does not offer the same flexibility in terms of implementation nor the ability to incorporate domain specific data to populate glossaries.
Based on all the available options, we decided to move forward with a short and long term translation strategy.
Use translate.google.com to provide translations for all languages provided/supported by the tool. Users should be able to translate any page or document (on pages) after selecting a language of their choice.
There is no cost associated with translation in this first iteration, given that we are proposing to use the free Google Translate web translator.
This short term strategy has no effect on any of our current content creation workflows.
Pre-translated content
We encountered an issue with existing multilingual content on boston.gov, i.e. pre-translated pages in Drupal. When, for example, a user visits the boston.gov homepage and selects a given language using the translate button in the site header, the URL of the homepage (www.boston.gov) is run through the Google web translator and a translated version of the homepage is displayed to the user. When the user subsequently navigates through the site, new pages opened by the user are also opened in the Google web translator. Thus, a user could navigate from the homepage (post-translation) to one of the existing multilingual pages on boston.gov. In this case, the Google web translator will re-translate the translated text, even if the translation setting of the translator and the language of the text on the page are the same. And most importantly, there are differences between the two versions of the text, i.e. between the pre-translated text (translated by a human) and the re-translated pre-translated text (re-translated by the Google web translator). To solve this issue, we followed these steps:
Integrate Basic Google Translate API into current Drupal workflow such that when a new English language page is created in Drupal, multilingual copies of that page can be automatically generated, saved as drafts, and then subsequently quality checked and published by translators.
The cost of translation for the Basic API is $20 per million characters translated. Because we are only proposing to translate pages at the moment of publication, and then saving those translated pages as unique nodes in Drupal, the translation cost for this implementation of Google Translate API will likely be significantly lower than the cost of translating pages per user requests.
This long term strategy requires some changes to our current content creation workflow:
While Metrolist loads under the offsite version of Google Translate, the React views (Search and AMI Estimator) do not populate.
React Router matches on the current URL (window.location or document.location).
The Google Translate widget loads the entire page into an iframe.
Under normal circumstances, the React page loading inside of an iframe would not break anything, since iframes are self-contained. The location would still be e.g. https://www.boston.gov/metrolist/search even if included on another domain. However, Google needs to modify the content on the page in order to translate it, and it isn't possible to modify the contents of an iframe from the parent page (unless they talk to each other using postMessage). Therefore, Google merely scrapes the content of the included page and dynamically inserts it into an iframe that it controls.
Because of the above process, the location under Google Translate is not www.boston.gov but rather translate.googleusercontent.com. The path of the page becomes /translate_c, which throws off React Router matching that expects /metrolist.
index.bundle.js?v=2.x:2 Warning: You are attempting to use a basename on a page whose URL path does not begin with the basename. Expected path "/translate_c?depth=1&pto=aue&rurl=translate.google.com&sl=auto&sp=nmt4&tl=ja&u=https://www.boston.gov/metrolist/ami-estimator&usg=ALkJrhjYWXizTPU7YYBqcKUYUV0LgW-l5g" to begin with "/metrolist".
Google Translate also adds a base tag to the page's head set to the original URL (e.g. <base href="https://www.boston.gov/metrolist/ami-estimator" />). This is to make sure relative links won't break from being on a different domain. Ironically, this breaks navigation for Single-Page Apps, which use the HTML5 History API rather than doing a real server request. The History API cannot update locations across domains:
Uncaught DOMException: Failed to execute 'pushState' on 'History': A history state object with URL 'https://www.boston.gov/metrolist/ami-estimator/household-income' cannot be created in a document with origin 'https://translate.googleusercontent.com' and URL 'https://translate.googleusercontent.com/translate_c?depth=1&pto=aue&rurl=translate.google.com&sl=auto&sp=nmt4&tl=ja&u=https://www.boston.gov/metrolist/ami-estimator&usg=ALkJrhjWcZACKPBWvA4U6iyZm2NCx47XBw'.
Clicking the link on /metrolist/ami-estimator/result to /metrolist/search from within Google Translate is not possible since Google puts an X-Frame-Options security header on the iframe.
The top-level basename of /metrolist was removed and the routes were updated from /, /search, and /ami-estimator to /metrolist/, /metrolist/search, and /metrolist/ami-estimator respectively. This removed the basename mismatch console warning, but it did not fix the routing.
We detect whether we are inside a Google Translate iframe, i.e. whether the current domain is translate.googleusercontent.com (which should match 100% of the time, but in case that were to change we also check for translate.google.com or the path /translate_c) and whether there is a query string present. If both conditions are met, we scan for a query string parameter pointing to /metrolist/*, then extract the path from the first match. Google Translate re-hosts the page content by scraping whatever is specified in the u parameter ("u" for "URL", most likely), e.g. u=https://www.boston.gov/metrolist/search. Given that parameter is found, we can extract /metrolist/search and then manually override the React Router location to think it is on /metrolist/search even if it is actually on /translate_c.
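A condensed sketch of that detection and extraction; the helper name is hypothetical, and the real implementation lives in the Metrolist React code.

```typescript
function resolveMetrolistPath(): string | null {
  const { hostname, pathname, search } = window.location;
  const inGoogleTranslate =
    hostname === "translate.googleusercontent.com" ||
    hostname === "translate.google.com" ||
    pathname === "/translate_c";
  if (!inGoogleTranslate || !search) return null;

  // Google passes the original page URL in the `u` query parameter,
  // e.g. u=https://www.boston.gov/metrolist/search
  const original = new URLSearchParams(search).get("u");
  if (!original) return null;

  // Extract the /metrolist/* path to feed to React Router.
  const match = original.match(/\/metrolist\/?[^&]*/);
  return match ? match[0] : null; // e.g. "/metrolist/search"
}
```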
Additionally, we store references to the two Google URLs in localStorage (metrolistGoogleTranslateUrl and metrolistGoogleTranslateIframeUrl) for later use.
On forward/back navigation between AMI Estimator subroutes, we temporarily change the base from https://www.boston.gov/metrolist/ami-estimator (or the equivalent dev environment) to https://translate.googleusercontent.com/metrolist/ami-estimator. Even though the latter URL does not exist, it satisfies the necessary security conditions for navigation by keeping us on the same domain. Then, after navigating, the base is immediately changed back to boston.gov so links and assets do not break.
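A condensed sketch of that temporary base swap; the helper name is hypothetical, and history stands in for the React Router history object.

```typescript
function pushWithinTranslate(history: { push(path: string): void }, path: string) {
  const base = document.querySelector("base");
  if (!base) return history.push(path);

  const originalHref = base.href; // e.g. https://www.boston.gov/metrolist/ami-estimator
  // Point the base at the translator's domain so the History API resolves
  // the new entry same-origin instead of throwing a SecurityError...
  base.href = "https://translate.googleusercontent.com/metrolist/ami-estimator";
  history.push(path);
  // ...then immediately restore it so links and assets do not break.
  base.href = originalHref;
}
```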
Finally, the link on /metrolist/ami-estimator/result to /metrolist/search is swapped out with a new Google Translate URL. If left alone, the untranslated Search page would load inside the iframe. So we read localStorage.metrolistGoogleTranslateUrl and replace the u parameter with the equivalent /metrolist/search URL for whatever domain it's on. This URL has to be read from localStorage because if we try to read window.parent.location.href it will be blocked for security reasons:
Uncaught DOMException: Blocked a frame with origin "https://translate.googleusercontent.com" from accessing a cross-origin frame.
It also has to be loaded in a new tab/window with <a target="_blank"></a> because otherwise we get another security error: Refused to display 'https://translate.google.com/translate?depth=1&pto=aue&rurl=translate.google.com&sl=auto&sp=nmt4&tl=ja&u=https://www.boston.gov/metrolist/search' in a frame because it set 'X-Frame-Options' to 'deny'.
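A condensed sketch of that link swap, reading the stored translate URL and rewriting its u parameter; the localStorage key comes from above, and the helper name is hypothetical.

```typescript
function translatedSearchLink(): string | null {
  const stored = localStorage.getItem("metrolistGoogleTranslateUrl");
  if (!stored) return null;

  const url = new URL(stored);
  const original = url.searchParams.get("u");
  if (!original) return null;

  // Swap the path on whatever domain the original page was on,
  // then point the translate URL's `u` parameter at it.
  const target = new URL(original);
  target.pathname = "/metrolist/search";
  url.searchParams.set("u", target.toString());
  return url.toString(); // use with <a target="_blank"> to avoid X-Frame-Options
}
```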
If Google Translate changes the way their code works, this could break.
Although this fix is verifiable on CI as far as translation goes, until the appropriate CORS headers are added to Acquia, the parts of the app that rely on API data will not resolve, so it will still appear broken. Although this also has to do with cross-origin restrictions, it is completely unrelated to the Translate issue, so it is safe to ignore. But to work around this and verify that the site is indeed 100% working, you can run Chrome without security enabled. Download Chrome Canary and run this command (macOS, but you can search for your platform equivalent): open -n -a Google\ Chrome\ Canary --args --disable-web-security --user-data-dir=/tmp/chrome --disable-site-isolation-trials --allow-running-insecure-content.
Welcome to the City of Boston’s Transportation Demand Management (TDM) Point System.
This application helps new large developments reduce vehicle trips. The tool assigns points to large developments based on their location and proximity to multi-modal transportation options. Developers are prompted to choose from a wide variety of strategies to reach the target points. The TDM plan is submitted as part of the transportation development review process.
MOBILITY SCORE:
Mobility Score is a factor that determines the target score for selecting TDM strategies.
TDM TARGET SCORE:
Development projects will need to select TDM strategies to meet this minimum number of points to satisfy Boston Transportation Department (BTD) development review requirements.
COVID19 Response - https://www.usdigitalresponse.org/
Three projects running right now.
UX/IA for boston.gov/coronavirus and subsequent content
Reworking the coronavirus page information - initial recommendations delivered by Sam Moore, most of which we implemented as best we could:
Reworking the emergency alert on the boston.gov homepage and subsequent site-wide alerts; implementation in process on these recommendations plus [new idea].
General social media strategy and reporting support - Prince Boucher
Graphic design support - Brandon Cole
Google translation/translation in general - Chris Guess - led to Digital implementing.