@ANT_CLASSPATH@
@ANT_ALWAYS_EXCLUDE@
@ANT_EXCLUDE@
@ANT_EXCLUDE_TESTS@
Check to see if https://www.npmjs.org is up.
$PTII/mk/ptII.mk does not exist, perhaps (cd $PTII; ./configure) has not been run?
$PTII/bin/vergil does not exist, perhaps (cd $PTII/bin; make) has not been run?
This can happen if the make binary cannot be found in the path or if $PTII/mk/ptII.mk
is not present.
Skipping running make in $PTII/bin, which creates scripts like $PTII/bin/vergil.
This is only an issue if you want to invoke "$PTII/bin/vergil" instead of "ant vergil".
However, $PTII/bin does include other scripts that may be of use.
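For example, a minimal sketch of the two ways of starting Vergil mentioned above, assuming a Unix-like shell:
(cd $PTII; ./configure)
(cd $PTII/bin; make)
$PTII/bin/vergil
or, skipping the $PTII/bin scripts entirely:
ant vergil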
Use ant -p to see other targets. JAVA_HOME=${env.JAVA_HOME}
tools.jar is necessary for compilation of doc/doclets/.
If compilation fails, try setting JAVA_HOME to the location of the JDK.
For example:
[bldmastr@sisyphus ptII]$ which java
/usr/bin/java
[bldmastr@sisyphus ptII]$ ls -l /usr/bin/java
lrwxrwxrwx 1 root root 26 Jul 31 10:12 /usr/bin/java -> /usr/java/default/bin/java
[bldmastr@sisyphus ptII]$ export JAVA_HOME=/usr/java/default
Running make in $PTII/lbnl, which creates executables like cclient.
Typically, this is run after build-project so that the .class files are created.
$PTII/mk/ptII.mk does not exist, perhaps (cd $PTII; ./configure) has not been run?
Skipping running make in $PTII/lbnl, which creates executables like cclient.
This is only an issue if you are trying to build
the Building Controls Virtual Test Bed (BCVTB).
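If you do need the BCVTB binaries, a possible sequence based on the prerequisites mentioned above is to run configure and build-project first:
(cd $PTII; ./configure)
ant build-project
and then re-run the Ant target that printed this message so that make is invoked in $PTII/lbnl.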
@ANT_MAVEN_BUILD_WEBSENSOR@
@PTADD_MODULES_ANT_JAVAC@
Building OpenCV for Travis.
To avoid an 'OutOfMemoryError' exception, under bash, try 'export ANT_OPTS=-Xmx1024m'
Removing .class files. Use 'ant cleanall' to remove files in reports/ and codeDoc/.
Use 'ant jars.clean' to remove the jar files created by 'ant jars'.
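For example, a full cleanup under bash, using only the commands mentioned above:
export ANT_OPTS=-Xmx1024m   # avoid the 'OutOfMemoryError' noted earlier
ant cleanall                # also removes files in reports/ and codeDoc/
ant jars.clean              # removes the jar files created by 'ant jars'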
@ANT_WEBSENSOR_SOURCEPATH@
Target to be invoked by users who create installers in the adm/gen-X.Y subdirectory.
This target sends its output to stdout. The test.installers target sends its output to reports/junit.
Generate javadoc .xml files used for actor documentation.
Generate doc/codeDoc/allNamedObjs.txt for use by javadoc.actorIndex
For details, see $PTII/ptolemy/vergil/basic/docViewerHelp.htm
Generate actor/demonstration index files.
Read the doc/codeDoc/allNamedObjs.txt file created by javadoc.actor.
Create the actor index.
For details, see $PTII/ptolemy/vergil/basic/docViewerHelp.htm
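A possible invocation order, assuming the javadoc.actor and javadoc.actorIndex targets described above can be run directly from the command line:
ant javadoc.actor        # writes doc/codeDoc/allNamedObjs.txt
ant javadoc.actorIndex   # reads allNamedObjs.txt and creates the actor index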
@PTADD_MODULES_ANT@
Generate javadoc without building the actor documentation and the actor/demonstration index.
Compile doc/doclets.
tools.jar is necessary for compilation of doc/doclets/.
If compilation fails, try setting JAVA_HOME to the location of the JDK.
For example:
[bldmastr@sisyphus ptII]$ which java
/usr/bin/java
[bldmastr@sisyphus ptII]$ ls -l /usr/bin/java
lrwxrwxrwx 1 root root 26 Jul 31 10:12 /usr/bin/java -> /usr/java/default/bin/java
[bldmastr@sisyphus ptII]$ export JAVA_HOME=/usr/java/default
Invoke jsdoc to generate documentation for .js files.
To set this up:
cd $PTII/vendors; git clone https://github.com/terraswarm/jsdoc.git
$PTII/doc/jsdoc/topREADME.md will be used as the basis of the first page.
The output appears in doc/codeDoc/js/index.html
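A minimal setup-and-run sketch; the target name 'jsdoc' is assumed here from the description above:
cd $PTII/vendors
git clone https://github.com/terraswarm/jsdoc.git
cd $PTII
ant jsdoc   # output appears in doc/codeDoc/js/index.html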
The node binary was not found or did not work, or npmjs.org was not reachable. The network is probably down, so there is no point in trying npm.
If the ptdoc target fails, then try running:
ant build-project build-bin
Node was not found in the path or failed, so 'ant ptdoc' will not be run.
This means that documentation for Accessors will not be available.
Npm was not found in the path or failed, so 'ant ptdoc' will not be run.
This means that documentation for Accessors will not be available.
==test.32bit==
This target runs 32-bit JVM tests from auto32/.
Various test targets are available:
test - builds and runs test.short, test.long and test.longest
test.32bit - runs the 32-bit JVM tests in auto32/
test.long - runs tests that are fairly long (called by the test target)
test.longest - runs tests that are longest (called by the test target)
test.report.all - runs a batch of tests specified by a path with wildcards (output=files)
test.report.short - runs tests that are fairly fast, generates JUnit xml
test.short - runs tests that are fairly fast (called by the test target)
test.single - runs a single test specified by its class name (output=stdout)
test.batch = ${test.batch}
timeout = ${timeout.short} milliseconds
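For example, sketches of invoking the targets listed above; the class name for test.single is taken from the test.single help further below:
ant test.short
ant test.32bit
ant test.single -Dtest.name=ptolemy.kernel.test.junit.JUnitTclTest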
==test.capecode ==
This target runs the Cape Code accessors tests
timeout = ${timeout.long} milliseconds
${test.capecode.echo.compile}
==test.capecode1 ==
This target runs the first batch of Cape Code accessors tests
timeout = ${timeout.long} milliseconds
==test.capecode2 ==
This target runs the second batch of Cape Code accessors tests
timeout = ${timeout.long} milliseconds
==test.capecode3 ==
This target runs the third batch of Cape Code accessors tests
timeout = ${timeout.long} milliseconds
==test.capecode1.xml ==
This target runs the first batch of Cape Code accessors tests and
generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.capecode2.xml ==
This target runs the second batch of Cape Code accessors tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.capecode3.xml ==
This target runs the third batch of Cape Code accessors tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
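A sketch of running the Cape Code batches described above, either for human-readable output or for JUnit xml:
ant test.capecode1 test.capecode2 test.capecode3
ant test.capecode1.xml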
==test.core1.xml ==
This target runs the first batch of core tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.core2.xml ==
This target runs the second batch of core tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.core3.xml ==
This target runs the third batch of core tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.core4.xml ==
This target runs the fourth batch of core tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.core5.xml ==
This target runs the fifth batch of core tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.core6.xml ==
This target runs the sixth batch of core tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.core7.xml ==
This target runs the seventh batch of core tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.export1.xml ==
This target runs the first batch of export demo tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.export2.xml ==
This target runs the second batch of export demo tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.export3.xml ==
This target runs the third batch of export demo tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.export4.xml ==
This target runs the fourth batch of export demo tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.export5.xml ==
This target runs the fifth batch of export demo tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.export6.xml ==
This target runs the sixth batch of export demo tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
==test.export7.xml ==
This target runs the seventh batch of export demo tests
and generates JUnit xml output.
timeout = ${timeout.long} milliseconds
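The core and export batches above follow the same pattern as the Cape Code batches; for example:
ant test.core1.xml
ant test.export1.xml
The JUnit xml output presumably lands in reports/junit, as with the other xml-producing targets mentioned in this document.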
To run code coverage on just one test, use
ant test.cobertura.single
test.cobertura: Run the short tests with timeout=${timeout.short}
test.cobertura: Run the 32-bit tests with timeout=${timeout.short}
test.cobertura: Run the long tests with timeout=${timeout.long}
test.cobertura: Run the longest tests with timeout=${timeout.longest}
Run the longest tests using code coverage
Run the longest tests with timeout=${timeout.longest}
==test.cobertura.single==
To run code coverage on a different test, use the -Dtest.name, for example:
ant test.cobertura.single -Dtest.name=ptolemy.configs.test.junit.JUnitTclTest
The formatter is controlled by junit.single.formatter, which defaults to plain
Other values are brief, failure and xml.
Run the tests in test.name=${test.name} with timeout=${timeout.short}
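Combining the two properties described above, an example that also switches the formatter to xml:
ant test.cobertura.single -Dtest.name=ptolemy.configs.test.junit.JUnitTclTest -Djunit.single.formatter=xml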
Target to be invoked by the nightly build that creates installers in the adm/gen-X.Y subdirectory.
This test generates .xml files in reports/junit. The installers target sends its output to stdout.
==test.long==
This target runs the tests specified by a file name and generates
human-readable output to stdout.
The default is to run all the fairly long-running tests.
Various test targets are available:
test - builds and runs test.short, test.long and test.longest
test.32bit - runs the 32-bit JVM tests in auto32/
test.long - runs tests that are fairly long (called by the test target)
test.longest - runs tests that are longest (called by the test target)
test.report.all - runs a batch of tests specified by a path with wildcards (output=files)
test.report.short - runs tests that are fairly fast, generates JUnit xml
test.short - runs tests that are fairly fast (~30 min., called by the test target)
test.single - runs a single test specified by its class name (output=stdout)
timeout = ${timeout.long} milliseconds
==test.longest==
This target runs the tests specified by a file name and generates
human-readable output to stdout.
Various test targets are available:
test - builds and runs test.short, test.long and test.longest
test.32bit - runs the 32-bit JVM tests in auto32/
test.long - runs tests that are fairly long (called by the test target)
test.longest - runs tests that are longest (called by the test target)
test.report.all - runs a batch of tests specified by a path with wildcards (output=files)
test.report.short - runs tests that are fairly fast, generates JUnit xml
test.short - runs tests that are fairly fast (~30 min., called by the test target)
test.single - runs a single test specified by its class name (output=stdout)
timeout = ${timeout.longest} milliseconds
==test.mocha==
This target uses mocha to run the Node.js tests in **/mocha/test*.js files.
This target requires setup:
sudo npm install -g mocha
sudo npm install -g mocha-junit-reporter
See https://chess.eecs.berkeley.edu/ptexternal/wiki/Main/JSMocha
Running test.mocha in ${converted}
==test.mocha.xml==
This target uses mocha to run the Node.js tests in **/mocha/test*.js files.
This target requires setup:
sudo npm install -g mocha
sudo npm install -g mocha-junit-reporter
See https://chess.eecs.berkeley.edu/ptexternal/wiki/Main/JSMocha
The output will appear in reports/junit/mochaJUnit.xml
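A minimal setup-and-run sketch, assuming the test.mocha and test.mocha.xml target names shown above:
sudo npm install -g mocha
sudo npm install -g mocha-junit-reporter
ant test.mocha       # runs the mocha tests
ant test.mocha.xml   # output in reports/junit/mochaJUnit.xml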
==test.report==
This target runs all the tests (short, long, longest) and generates
${junit.formatter} output in ${junit.output.dir}.
test - builds and runs test.short, test.long and test.longest
test.32bit - runs the 32-bit JVM tests in auto32/
test.long - runs tests that are fairly long (called by the test target)
test.longest - runs tests that are longest (called by the test target)
test.report.all - runs a batch of tests specified by a path with wildcards (output=files)
test.report.short - runs tests that are fairly fast, generates JUnit xml
test.short - runs tests that are fairly fast (~30 min., called by the test target)
test.single - runs a single test specified by its class name (output=stdout)
test.batch = ${test.batch}
timeout.long = ${timeout.long}
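For example, and since this target echoes test.batch, it presumably accepts the same -Dtest.batch override shown for test.short below:
ant test.report
ant test.report -Dtest.batch=ptolemy/domains/continuous/**/junit/*.java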
==test.report.short==
This target runs the tests specified by a file name and generates
${junit.formatter} output in ${junit.output.dir}.
The default is to run all the fairly short-running tests.
On a fast machine, these tests take around 30 minutes.
To run different tests, use -Dtest.batch, for example:
ant test.short -Dtest.batch=ptolemy/domains/continuous/**/junit/*.java
test - builds and runs test.short, test.long and test.longest
test.32bit - runs the 32-bit JVM tests in auto32/
test.long - runs tests that are fairly long (called by the test target)
test.longest - runs tests that are longest (called by the test target)
test.report.all - runs a batch of tests specified by a path with wildcards (output=files)
test.report.short - runs tests that are fairly fast, generates JUnit xml
test.short - runs tests that are fairly fast (~30 min., called by the test target)
test.single - runs a single test specified by its class name (output=stdout)
test.batch = ${test.batch}
timeout.long = ${timeout.long}
==test.short==
This target runs the tests specified by a file name and generates
human-readable output to stdout.
The default is to run all the fairly short-running tests.
On a fast machine, these tests take around 30 minutes.
To run different tests, use -Dtest.batch, for example:
ant test.short -Dtest.batch=ptolemy/domains/continuous/**/junit/*.java
Various test targets are available:
test - builds and runs test.short, test.long and test.longest
test.32bit - runs the 32-bit JVM tests in auto32/
test.long - runs tests that are fairly long (called by the test target)
test.longest - runs tests that are longest (called by the test target)
test.report.all - runs a batch of tests specified by a path with wildcards (output=files)
test.report.short - runs tests that are fairly fast, generates JUnit xml
test.short - runs tests that are fairly fast (called by the test target)
test.single - runs a single test specified by its class name (output=stdout)
test.batch = ${test.batch}
timeout = ${timeout.short} milliseconds
${test.short.echo.compile}
==test.single==
Run a single JUnit test by classname (ptolemy.actor.lib.test.junit.JUnitTclTest).
To run a different test, use
-Dtest.name=ptolemy.actor.lib.test.junit.JUnitTclTest
and
-Djunit.single.formatter=xml|brief|plain|failure (default is plain)
For example:
* Run all the tests in actor/lib/test
ant test.single
* Run all the tests in kernel/test
ant test.single -Dtest.name=ptolemy.kernel.test.junit.JUnitTclTest
Various other test targets are available:
test - builds and runs test.short, test.long and test.longest
test.32bit - runs the 32-bit JVM tests in auto32/
test.cobertura - runs tests using code coverage
test.long - runs tests that are fairly long (called by the test target)
test.longest - runs tests that are longest (called by the test target)
test.report.all - runs a batch of tests specified by a path with wildcards (output=files)
test.report.short - runs tests that are fairly fast, generates JUnit xml
test.short - runs tests that are fairly fast (called by the test target)
test.single - runs a single test specified by its class name (output=stdout)
==test.travis.timeout.fail.xml==
This target runs a test that will fail, indicating a timeout problem with Travis.
If the Travis continuous integration system needs to update the OpenCV cache,
then there is a chance that building OpenCV will take up too much time and the
rest of the build will fail.
If too much time is taken, then we should skip running the tests, but
we then need a test failure that indicates that the other tests did not run.
@PTADD_MODULES_ANT@