# Development
Spark is based on Dart, a new object-oriented programming language for writing scalable web apps and reusable libraries. You must install the Dart SDK in order to build and run Spark. One easy and highly recommended way to do that is to install the DEV channel of Dart Editor, an excellent open-source IDE for editing, debugging, and running web applications based on Dart/HTML/CSS. DO NOT use the big "DOWNLOAD DART" buttons on that page: those are for the STABLE channel, which lags behind the DEV channel in features and bug fixes (including those in the bundled Dart SDK and Dartium); instead, use the 32-bit and 64-bit links below the buttons.
On Windows, we use the 7zip utility for archiving Spark (see below). That step is optional, but if you want to run it, install 7zip and add its installation directory (the default is `C:\Program Files\7-Zip`) to your `PATH` environment variable.
Below is the de-facto debugging process that the core team currently uses. It forgoes the complete build and runs the "uncompiled" version of Spark to speed up development. However, this approach relies on Spark's dependency packages not being changed during the build. With the recent introduction of transformers in the pub package manager, packages can now be rewritten during the `pub build` step, so this way of debugging may become obsolete in the near future if any of the dependency packages (or their own dependencies) start using that feature.
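For context, a dependency opts into transformation by declaring a `transformers` section in its `pubspec.yaml`. A minimal illustrative fragment (the package name and entry point here are made up; the polymer transformer is the one Spark's own build already uses):

```yaml
# pubspec.yaml of a hypothetical dependency package; `pub build` would
# run the declared transformer over the package's assets.
name: some_package
dependencies:
  polymer: ">=0.9.0 <0.10.0"
transformers:
- polymer:
    entry_points: web/index.html
```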
The minimum required set of steps to launch Spark in Dart Editor is:

1. Add the Spark checkout directory via the `File -> Open Existing Folder...` menu.
2. Right-click `ide/pubspec.yaml` and select `Pub Get`, unless you have already run `pub get` from the command line.
   - NOTE: after the initial run of `Pub Get`, you should subsequently use `Pub Upgrade` instead in order to keep your dependency packages up-to-date as new versions become available.
3. Right-click `ide/build.dart` and select `Run`. This copies the dependency packages from the `ide/packages` symlink (created by pub in step 2) into the `ide/app/packages` directory; this is necessary because of the current misalignment between pub's way of doing things and Chrome app requirements.
4. Right-click `ide/app/manifest.json` and select `Run As Chrome App`. This launches Dartium, installs Spark into it as a Chrome app, and launches Spark (you will see two windows open: Dartium and Spark). At this point, you can debug the app (set breakpoints in the Dart code, etc.).
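Step 3's package copy can also be reproduced by hand. A rough sketch in a scratch directory (the `.pub-src` source directory name is a made-up stand-in for wherever pub materializes the packages):

```shell
# Simulate what ide/build.dart does: copy the contents of the pub-created
# `packages` symlink into `app/packages`, dereferencing the symlink.
set -e
root=$(mktemp -d); cd "$root"
mkdir -p ide/.pub-src/foo ide/app            # .pub-src: hypothetical source
echo 'library foo;' > ide/.pub-src/foo/foo.dart
ln -s "$root/ide/.pub-src" ide/packages      # what pub get creates
cp -RL ide/packages ide/app/packages         # -L resolves the symlink
```

The `-L` flag matters: Chrome apps need real files under `app/packages`, not a symlink.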
IMPORTANT: You should debug Spark using the Dart SDK's checked mode, not production mode, in order to catch problems early. To do that, after the first launch, go to `Run -> Manage Launches...`, select the launch configuration corresponding to your Spark (it will initially be named something like `__MSG_app_name__ (1)...`, but you can rename it), and do one of the following:

- Current version on the DEV channel of Dart Editor (as of 12/19/2013): add `DART_FLAGS=--enable-checked-mode` to `Environment variables`.
- Some near-future version: make sure that the `Checked mode` checkbox is selected.
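When launching from a terminal rather than the editor, the same effect can be had by exporting the variable before starting the launch (the standalone VM equivalent is `dart --checked`):

```shell
# Enable checked mode via the environment, mirroring the launch-dialog
# setting described above.
export DART_FLAGS=--enable-checked-mode
echo "$DART_FLAGS"
```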
NOTE: Dart Editor's Run button/menu/shortcut is context-dependent by default: it launches different things depending on which kind of file you currently have open. It might be a good idea to check `Run -> Always Run Last Launch` to make launches more predictable.
Building is done using the `./grind` script. We have two main build targets, `./grind deploy` and `./grind deploy-test`.

For the `deploy` target:
- A full copy of the app is made in `build/deploy`.
- A polymer deploy step is run on that copy, with the output going into `build/deploy-out`.
- `dart2js` is run on the results.
- A fully polymerized, JavaScript-compiled version of the app is then available in `build/deploy-out`.
Similarly, for the `deploy-test` target:

- The app is copied into `build/deploy-test`.
- Polymer is deployed, and dart2js compiled, into `build/deploy-test-out`.
- That version of the app has our tests compiled in, and can test itself.
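The directory flow of the `deploy` target can be sketched in a scratch directory as follows (placeholder files stand in for the real polymer and dart2js output):

```shell
# Simulate the deploy target's layout; the real build runs polymer deploy
# and dart2js instead of creating placeholder files.
set -e
root=$(mktemp -d); cd "$root"
mkdir -p app build
touch app/spark.dart
cp -R app build/deploy                # full copy of the app
mkdir -p build/deploy-out             # polymer deploy output goes here
touch build/deploy-out/spark.dart.js  # dart2js result lands alongside
```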
On Windows, use the `.\grind.cmd` batch file instead of the `grind` script. The build targets become, respectively, `.\grind.cmd deploy` and `.\grind.cmd deploy-test`.
The `./grind archive` command will run the `deploy` step and archive the results into `dist/spark.zip`.
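What the archive step produces can be approximated by zipping the deploy output yourself. A sketch using Python's stdlib `zipfile` CLI so it runs without 7zip or `zip` installed (paths as in the text, file contents are placeholders):

```shell
# Zip build/deploy-out into dist/spark.zip, roughly what the archive
# target does after a deploy build.
set -e
root=$(mktemp -d); cd "$root"
mkdir -p build/deploy-out dist
echo 'compiled js' > build/deploy-out/spark.dart.js
python3 -m zipfile -c dist/spark.zip build/deploy-out
```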
On Windows, the equivalent `.\grind.cmd archive` command does the same.