Add support for Apache Spark (Core module). #20
Open
ZHLOLin wants to merge 20 commits into xlab-uiuc:spark from ZHLOLin:spark-core
Conversation
Default values for configurations, collected from Spark's official documentation: https://spark.apache.org/docs/latest/configuration.html
Update constants to support the Spark Core module. LOCAL_SUREFIRE_PATH for spark-core is not added, since ScalaTest does not generate surefire reports that contain the CTest log; the reports for parsing are generated by the runner scripts.
ScalaTest allows white space (" ") in test method names. To support the spark-core module, I have updated the variable used in the split calls on lines 28, 29, 39, and 40.
Update the runner script: add a command for running Scala tests in Maven, and add a write-report method to generate the CTest report for parsing.
Add support for the spark-core module.
Add support for the spark-core module. CTEST_SPARK_DIR is the directory containing the POM file.
Add support for the spark-core module. Injection is done by adding system properties to the POM file.
Many methods in run_test_utils did not support the Scala test command; they have been updated.
Update run_test.py according to the methods changed in utils.
Add a report generator.
Add support for Apache Spark (Core module).
Details:
Logging Configuration API
-Code changes
Since Spark is written in Scala and uses the ScalaTest plugin for unit testing, there are many changes to the runner and collector.
For runner.py:
Add a write_report function to generate reports for parsing, since ScalaTest does not generate reports that contain CTest logging information.
Modify some conditional statements to support Spark.
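The write_report idea can be sketched as follows. This is a minimal illustration only: the function signature, report fields, and log-line format are assumptions, not the actual runner.py API.

```python
# Hypothetical sketch of a write_report helper: ScalaTest does not emit
# surefire-style reports containing CTest log lines, so the runner writes
# its own report for the collector to parse.
import json
import time


def write_report(path, test_name, passed, ctest_log_lines):
    """Write one test's outcome plus its captured CTest log lines as JSON."""
    report = {
        "test": test_name,            # fully qualified suite/test name
        "passed": passed,             # outcome of the test run
        "time": time.time(),          # when the report was written
        "ctest_logs": ctest_log_lines # log lines captured from the run
    }
    with open(path, "w") as f:
        json.dump(report, f, indent=2)


# Example usage with an illustrative suite name and log line:
write_report("report.json", "org.apache.spark.rdd.RDDSuite", True,
             ["[ctest] get spark.buffer.size"])
```

The collector can then read the JSON back instead of scanning surefire XML.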
For collector.py:
Update some functions to support projects that allow white space in test names.
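The white-space issue can be illustrated with a small sketch. The line format, separator, and function name below are assumptions for illustration, not the actual collector.py code: the point is to split once on an explicit delimiter instead of on every whitespace run, so a ScalaTest name with spaces survives intact.

```python
# Illustrative fix: a mapping line pairs a parameter with a test whose
# name may contain spaces (ScalaTest allows them), so a bare str.split()
# would break the test name apart. Splitting once on the tab separator
# keeps the full name. Format is a hypothetical example.
def parse_mapping_line(line):
    # e.g. "spark.buffer.size\torg.apache.spark.rdd.RDDSuite#basic operations"
    param, test = line.rstrip("\n").split("\t", 1)  # split once; spaces preserved
    return param, test


param, test = parse_mapping_line(
    "spark.buffer.size\torg.apache.spark.rdd.RDDSuite#basic operations\n")
```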
Scripts and Const:
Modify add_project.sh, identify_param.sh, and constant.py to set up the project and generate the mapping for Spark.
-Data collected
conf_params.txt, test_method_list.json, spark-core-default.tsv, and opensource-spark-core.json
Intercept Configuration API:
-Code changes
For inject.py
Update injection scripts to support Spark.
Spark stores all default configurations as static singleton ConfigEntry objects. The SparkConf object loads user-specified configurations from the JVM's system properties, so I modified the POM file to specify system properties for the ScalaTest plugin, overriding the default configuration loading.
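A POM fragment in this spirit might look like the sketch below. The property name and value are illustrative, not taken from the PR; SparkConf picks up JVM system properties whose keys start with "spark.", so properties set for the test JVM override the ConfigEntry defaults.

```xml
<!-- Hypothetical scalatest-maven-plugin configuration: the injected
     property below is an example, not the PR's actual injection. -->
<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <configuration>
    <systemProperties>
      <!-- Overrides the spark.buffer.size default for the test JVM -->
      <spark.buffer.size>32k</spark.buffer.size>
    </systemProperties>
  </configuration>
</plugin>
```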
For run_test_utils.py
Add a Maven command for running Spark's ScalaTest suites.
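A helper in the spirit of run_test_utils could build that command as sketched below. The function name and module path are assumptions; the -DwildcardSuites flag comes from the scalatest-maven-plugin, and -Dtest=none skips the surefire (Java) tests.

```python
# Illustrative helper (not the actual run_test_utils API): build the Maven
# command for running a single ScalaTest suite in one module.
def maven_scalatest_cmd(module_dir, suite):
    return [
        "mvn", "test",
        "-pl", module_dir,               # restrict to one module, e.g. "core"
        f"-DwildcardSuites={suite}",     # select the ScalaTest suite to run
        "-Dtest=none",                   # skip surefire-run Java tests
    ]


# Example usage with a hypothetical suite:
cmd = maven_scalatest_cmd("core", "org.apache.spark.rdd.RDDSuite")
```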
For run_test.py
Update run_test.py according to the methods changed in utils.