# Development
- Install the Dart SDK. Spark is based on Dart, a new object-oriented programming language for writing scalable web apps and reusable libraries. You must install the Dart SDK in order to build and run Spark. One easy and highly recommended way to do that is to install the DEV channel of Dart Editor, an excellent open-source IDE for editing, debugging, and running web applications based on Dart/HTML/CSS. DO NOT use the big "DOWNLOAD DART" buttons on that page: those are for the STABLE channel, which lags behind the DEV channel in features and bug fixes, including those in the bundled Dart SDK and Dartium. Use the 32-bit and 64-bit links under the buttons instead.
- Set the `DART_SDK` environment variable to point at your Dart SDK installation, and add the SDK's `bin` directory to your `PATH` environment variable. For example, with bash on Mac or Linux:

```shell
export DART_SDK=/path/to/dart-sdk
export PATH=$PATH:$DART_SDK/bin
```

Don't forget to re-source your environment configuration file or reboot your machine afterwards for the changes to take effect.
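To confirm the variables took effect in a fresh shell, a quick sanity check (the install location below is a placeholder; substitute your actual `dart-sdk` path):

```shell
# Placeholder install location -- point DART_SDK at your real dart-sdk
# directory instead.
export DART_SDK="$HOME/dart-sdk"
export PATH="$PATH:$DART_SDK/bin"

# Both lines should print the paths just set; once a real SDK is
# installed there, 'dart --version' will work too.
echo "$DART_SDK"
echo "$PATH" | grep -o "dart-sdk/bin"
```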
On Windows, we use the 7zip utility to archive Spark (see below). That step is optional, but if you want to run it, install 7zip and add its installation directory (the default is `C:\Program Files\7-Zip`) to your `PATH` environment variable.
Below is the de-facto debugging process that the core team currently uses. It foregoes the complete build and runs the "uncompiled" version of Spark to speed up development. However, this approach relies on Spark's dependency packages not being changed during the build: with the recent introduction of transformers in the pub package manager, packages can now be rewritten by a `pub build` step that carries out the transformation. So this way of debugging may become obsolete in the near future if any of the dependency packages (or their own dependencies) start using this feature.
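For context, a package opts into transformation by declaring a `transformers` section in its `pubspec.yaml`. A minimal hypothetical example (package name and entry point are made up for illustration):

```yaml
name: my_app
dependencies:
  polymer: any
transformers:
- polymer:
    entry_points: web/index.html
```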
Anyway, the minimum set of steps required to launch Spark in Dart Editor is:
1. Add the Spark checkout directory using the `File -> Open Existing Folder...` menu.
2. Right-click `ide/pubspec.yaml` and select `Pub Get`, unless you have already run `pub get` from the command line.
3. NOTE: after the initial run of `Pub Get`, you should subsequently use `Pub Upgrade` instead in order to keep your dependency packages up-to-date as new versions become available.
4. Right-click `ide/build.dart` and select `Run`. This will copy the dependency packages from the `ide/packages` symlink (created by pub in step 2 or 3) into the `ide/app/packages` directory: this is necessary because of the current misalignment between pub's way of doing things and Chrome app requirements.
5. Right-click `ide/app/manifest.json` and select `Run As Chrome App`. This will launch Dartium, install Spark as a Chrome app into it, and launch Spark (you will see two windows open: Dartium and Spark). At this point, you can debug the app (set breakpoints in the Dart code, etc.).
6. If the application layout does not render correctly, you MUST turn on the "Enable experimental Web Platform features" flag in `chrome://flags` in your Dartium, unless you've already done so.
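The `build.dart` packages copy described above can be sketched in shell. This is a toy reconstruction with placeholder paths, not the real build script: pub creates a `packages` symlink, and the build copies its dereferenced contents into `app/packages`, since a Chrome app cannot follow symlinks out of its app directory.

```shell
# Toy setup: a fake pub cache and an ide/ tree (placeholder paths).
mkdir -p demo/ide/app demo/pub-cache/foo
echo "// dummy library" > demo/pub-cache/foo/foo.dart
ln -s ../pub-cache demo/ide/packages            # what 'pub get' sets up

# What build.dart emulates: -L dereferences the symlink so app/packages
# becomes a real directory tree.
cp -RL demo/ide/packages demo/ide/app/packages
ls demo/ide/app/packages/foo                    # -> foo.dart
```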
IMPORTANT: You should debug Spark using Dart SDK's checked mode, not production mode, in order to catch problems early. To do that, after the first launch, go to `Run -> Manage Launches...`, select the launch configuration corresponding to your Spark (it will initially look something like `__MSG_app_name__ (1)...`, but you can rename it), and make sure that the `Checked mode` checkbox is selected.
NOTE: Dart Editor's Run button/menu/shortcut are context-dependent by default: they will launch different things depending on what kind of file you currently have open. It might be a good idea to check the `Run -> Always Run Last Launch` option to make launches more predictable.
Building is done using the `./grind` script. We have two main build targets, `./grind setup` and `./grind deploy`. You should run `./grind setup` whenever you start working on the project, and whenever the dependencies change, whether by you or by somebody else in an updated version of the code (in particular, whenever `pubspec.yaml` changes).
For the `deploy` target:

1. A full copy of the app is made into `build/deploy`.
2. A polymer deploy step is run on that copy, with the output going into `build/deploy-out`.
3. `dart2js` is run on the results.
4. A fully polymerized, JavaScript-compiled version of the app is then available in `build/deploy-out/web`.
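The resulting layout looks roughly like this (directory names are taken from the steps above; the exact contents may differ):

```
build/
├── deploy/        # full copy of the app sources
└── deploy-out/
    └── web/       # polymerized, dart2js-compiled app
```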
On Windows, use the `.\grind.cmd` batch file instead of the `grind` script.
The `./grind archive` command will run the `deploy` step and archive the results into `dist/spark.zip`.
On Windows, the equivalent is `.\grind.cmd archive`.