diff --git a/.run/Connector Consumer Corp.run.xml b/.run/Connector Consumer Corp.run.xml
index 08596021..504ec36e 100644
--- a/.run/Connector Consumer Corp.run.xml
+++ b/.run/Connector Consumer Corp.run.xml
@@ -8,8 +8,9 @@
+
-
\ No newline at end of file
+
diff --git a/.run/Connector _provider-manufacturing_.run.xml b/.run/Connector _provider-manufacturing_.run.xml
index 5b307483..6542f031 100644
--- a/.run/Connector _provider-manufacturing_.run.xml
+++ b/.run/Connector _provider-manufacturing_.run.xml
@@ -7,8 +7,9 @@
+
-
\ No newline at end of file
+
diff --git a/.run/Connector _provider-qna_.run.xml b/.run/Connector _provider-qna_.run.xml
index 9cd98e78..5534122d 100644
--- a/.run/Connector _provider-qna_.run.xml
+++ b/.run/Connector _provider-qna_.run.xml
@@ -6,8 +6,9 @@
+
-
\ No newline at end of file
+
diff --git a/.run/IdentityHub Consumer Corp.run.xml b/.run/IdentityHub Consumer Corp.run.xml
index 5322e94c..0568553f 100644
--- a/.run/IdentityHub Consumer Corp.run.xml
+++ b/.run/IdentityHub Consumer Corp.run.xml
@@ -7,8 +7,9 @@
+
-
\ No newline at end of file
+
diff --git a/.run/IdentityHub Provider Corp.run.xml b/.run/IdentityHub Provider Corp.run.xml
index c11d541e..c14fa041 100644
--- a/.run/IdentityHub Provider Corp.run.xml
+++ b/.run/IdentityHub Provider Corp.run.xml
@@ -6,8 +6,9 @@
+
-
\ No newline at end of file
+
diff --git a/.run/Provider Catalog Server.run.xml b/.run/Provider Catalog Server.run.xml
index 200345d9..0ee648f8 100644
--- a/.run/Provider Catalog Server.run.xml
+++ b/.run/Provider Catalog Server.run.xml
@@ -7,8 +7,9 @@
+
-
\ No newline at end of file
+
diff --git a/README.md b/README.md
index 078b00a0..1d534d31 100644
--- a/README.md
+++ b/README.md
@@ -106,8 +106,8 @@ Consumer Corp has a connector plus its own IdentityHub.
"provider-qna" and "provider-manufacturing" both have two data assets each, named `"asset-1"` and `"asset-2"` but
neither "provider-qna" nor "provider-manufacturing" expose their catalog endpoint directly to the internet. Instead, the
-catalog server (of the Provider Corp) provides a catalog that contains special assets (think: pointers) to both "
-provider-qna"'s and "provider-manufacturing"'s connectors, specifically, their DSP endpoints.
+catalog server (of the Provider Corp) provides a catalog that contains special assets (think: pointers) to both
+"provider-qna"'s and "provider-manufacturing"'s connectors, specifically, their DSP endpoints.
We call this a "root catalog", and the pointers are called "catalog assets". This means, that by resolving the root
catalog, and by following the links therein, "Consumer Corp" can resolve the actual asset from "provider-qna" and
@@ -211,7 +211,7 @@ latter is a compound run config and brings up all other runtimes together.
### 4.1 Start NGINX
-The issuer's DID document is hosted on NGINX, so the easiest way of running NGINX is with a docker container:
+The issuer's DID document is hosted on NGINX, so the easiest way to run NGINX is in a Docker container (Windows users should remove all `"` around `$PWD` and execute this command in PowerShell from the `MinimumViableDataspace` directory):
```shell
docker run -d --name nginx -p 9876:80 --rm \
@@ -266,15 +266,17 @@ The connector runtimes contain both the controlplane and the dataplane. Note tha
likely be separate runtimes to be able to scale and deploy them individually. Note also, that the Kubernetes deployment
(next chapter) does indeed run them as separate pods.
-The run configs use the `temurin-22` JDK. If you don't have it installed already, you can choose to install it (IntelliJ
-makes this really easy), or to select whatever JDK you have available in each run config.
+Select and execute the `dataspace` run configuration to start the runtimes now.
+
+The run configs use the `temurin-21` JDK. If you don't have it installed already, you can either install it (IntelliJ
+will prompt you to do so) or select whatever JDK you have available in each run config.
All run configs take their configuration from `*.env` files which are located in `deployment/assets/env`.
### 4.3 Seeding the dataspace
DID documents are dynamically generated when "seeding" the data, specifically when creating the `ParticipantContext`
-objects in IdentityHub. This is automatically being done by a script `seed.sh`.
+objects in IdentityHub. This is done automatically by the `seed.sh` script. Windows users might need to configure a shell interpreter in IntelliJ first; for example, `C:\WINDOWS\system32\CMD.exe` with the script option `/C`.
After executing the `dataspace` run config in Intellij, be sure to **execute the `seed.sh` script after all the runtimes
have started**. Omitting to do so will leave the dataspace in an uninitialized state and cause all
@@ -284,10 +286,9 @@ connector-to-connector communication to fail.
All REST requests made from the script are available in the [Postman
collection](./deployment/postman/MVD.postman_collection.json). With the [HTTP
-Client](https://www.jetbrains.com/help/idea/http-client-in-product-code-editor.html) and [Import from Postman
-Collections](https://plugins.jetbrains.com/plugin/22438-import-from-postman-collections) plugins, the Postman collection
-can be imported and then executed by means of the [environment file](./deployment/postman/http-client.env.json),
-selecting the "Local" environment.
+Client](https://www.jetbrains.com/help/idea/http-client-in-product-code-editor.html) plugin, the Postman collection
+can be imported (right-click the file in IntelliJ, then select `Convert Collection to .http File`).
+Then open the generated `.http` file in IntelliJ, replace all occurrences of `Content-Type: text/plain` with `Content-Type: application/json`, select "Run with environment: Local" ([environment file](./deployment/postman/http-client.env.json)), and run the desired requests.
Please read [chapter 7](#7-executing-rest-requests-using-postman) for details.
@@ -859,4 +860,4 @@ into a filter expression, for example `org.eclipse.edc.vc.type:DataProcessorCred
query for `DataProcessorCredentials` in the database.
The MVD uses the default `EdcScopeToCriterionTransformer` to achieve this. It is recommended to implement a custom
-`ScopeToCriterionTransformer` for an actual production scenario.
\ No newline at end of file
+`ScopeToCriterionTransformer` for an actual production scenario.