Contributing to Spark

Are you interested in helping to improve the Spark Design System? We welcome contributions from both designers and engineers.


All contributions to Spark Design System are expected to follow our change workflow process.

Types of Changes

Contributions are divided into two types: minor changes and substantial changes.

Minor Changes

Minor changes include, but are not limited to:

  • Bug fixes
  • Performance improvements
  • Accessibility improvements
  • Unit tests
  • Spelling and grammar fixes

Minor changes should begin with an issue on GitHub if you are unable to make the change yourself or are unsure if the change you want to make is needed. If you make the changes yourself, you can open a pull request for review.

If a minor change turns out to be substantial, the Spark team may ask you to go through the contribution workflow for substantial changes instead.

Substantial Changes

Substantial changes generally fall into two categories: new components, and amendments to an existing component that change its design or behavior. This type of change requires you to complete our contribution workflow so we can determine whether the new component or change aligns with the direction Spark is going.

Code Style Standards

Spark follows specific coding styles for HTML, CSS, and JavaScript to ensure maintainability and scalability. To successfully make a commit in this repo, the code must pass the pre-commit hooks that run automatically on commit. The pre-commit hooks run ESLint, Stylelint, and the Prettier code formatter.

HTML and CSS

  • Two spaces for indentation.
  • When the number of attributes on an element pushes the line past 80 characters, place each attribute on its own line for readability.
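For example, an element with enough attributes to exceed the 80-character limit might be formatted with one attribute per line (the class and data-attribute names below are illustrative, not actual Spark hooks):

```html
<!-- Attribute names here are hypothetical examples, not real Spark classes. -->
<button
  class="sprk-btn sprk-btn--secondary"
  type="button"
  data-analytics-id="hero-cta"
  aria-label="Start your application"
>
  Get Started
</button>
```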

JavaScript

  • Two spaces for indentation.
  • Spark JS uses new features from ESNext and assumes applications using Spark have a JavaScript compiler set up.
  • Spark uses ESLint for JS linting.
  • ESLint is set up to use the Airbnb JavaScript Style Guide, which is the source of Spark's JS coding conventions.
  • We follow JS recommendations from the Quicken Loans JS Concord Group.
  • Data attributes on DOM elements are the chosen method for DOM selection.
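As a sketch of the data-attribute selection convention: building selectors from data attributes keeps JS behavior hooks decoupled from styling classes. The helper and attribute names below are illustrative, not part of Spark's actual API.

```javascript
// Build an attribute selector string for DOM queries.
// dataSelector('sprk-toggle')      -> '[data-sprk-toggle]'
// dataSelector('sprk-toggle', 'x') -> '[data-sprk-toggle="x"]'
// "dataSelector" and "data-sprk-toggle" are hypothetical names for illustration.
const dataSelector = (name, value) =>
  value === undefined ? `[data-${name}]` : `[data-${name}="${value}"]`;

// In the browser, selection then avoids coupling to CSS classes:
// document.querySelectorAll(dataSelector('sprk-toggle'));
```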

Code Organization

Spark is managed as a monorepo. All of the Spark source code lives in a single repo, but is released as separate packages using Lerna.

This repo consists of the design system packages, wrapped in an instance of Drizzle, a tool built by Cloud Four for displaying pattern libraries. Spark uses Drizzle for documentation and plain HTML code examples.

In the packages folder are Spark-Core and Spark-Extras. These are the packages that are published to npm.
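A minimal Lerna configuration for this kind of layout might look like the following. This is a sketch under the assumption that packages live in a packages/ folder, not the repo's actual lerna.json:

```json
{
  "packages": ["packages/*"],
  "version": "independent"
}
```

With "version": "independent", Lerna versions and publishes each package (e.g. Spark-Core and Spark-Extras) separately rather than in lockstep.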

Running the Spark Docs site locally

  1. Run npm install
  2. Run gulp dev-all
  3. Open your browser to http://localhost:3000/

Contribution Workflow

Detailed documentation for this workflow is coming soon.

Once you have determined that a new component or change is necessary, please open an issue using the "Contribution Proposal" issue template in GitHub to start the review process.

The Spark team will review your proposal and determine if it's a good fit. If it is, we will either add it to our backlog or work with you on contributing the changes yourself.