Merge pull request #720 from EBISPOT/dev
Merge boolean and numeric fixes for v2-api-inconsistencies
henrietteharmse authored Aug 14, 2024
2 parents df6805c + c47d31e · commit 606eb62
Showing 12,072 changed files with 1,704,664 additions and 935,575 deletions.
The diff is too large to display in full; only the first 3,000 changed files are loaded.
52 changes: 32 additions & 20 deletions README.md
@@ -376,61 +376,73 @@ to ensure backward compatibility. This testing consists of
These tests are run locally as described in [Test testcases from dataload to UI](#test-testcases-from-dataload-to-ui).
Ensure that the environment variables `NEO4J_HOME`, `SOLR_HOME` and `OLS4_HOME` are set up accordingly.

1. First make sure all the OLS4 JARs are up to date by running :
1. Before running your testcases, ensure that your work is already committed. Create a new branch based on the branch you worked on,
but with a `-testcases` suffix. I.e., if your branch is called `fix-xyz`, the new branch for the testcases will be
`fix-xyz-testcases`. We commit testcases to a separate branch because of the large number of files updated when testcases are run.

2. First make sure all the OLS4 JARs are up to date by running:

mvn clean package

2. Generate new output files and import into Neo4J and Solr:
3. Generate new output files and import into Neo4J and Solr:

./dev-testing/teststack.sh ./testcases ./testcases_output

3. Compare `/testcases_output` with `/testcases_expected_output`:
4. Compare `/testcases_output` with `/testcases_expected_output`:

./compare_testcase_output.sh

4. The output of step 3 is written to `testcases_compare_result.log`. If no differences are found, this file will be empty.
Ensure that all differences in this file can be explained and that they do make sense.
5. The output of step 4 is written to `testcases_compare_result.log`. If no differences are found, this file will be empty.
`testcases_compare_result.log` only tells you which files differ. To see the actual differences indicated in
`testcases_compare_result.log`, compare the files it reports as different in a visual diff tool such as Meld. Ensure that
all differences in this file can be explained and that they make sense.

5. Once you are happy with the output in `testcases_output`, remove the old `testcases_expected_output` and replace with
6. Once you are happy with the output in `testcases_output`, remove the old `testcases_expected_output` and replace with
new expected output:

rm -rf testcases_expected_output
cp -r testcases_output/testcases testcases_expected_output

6. Now continue with API testing.
7. Add the updated expected output to Git.

git add -A testcases_expected_output

8. Commit the testcase updates to the branch with the `-testcases` suffix created in step 1.
9. Now continue with API testing.

### Testing API
Before doing API testing, you must have completed the [dataload testing](#testing-dataload).

7. Start the backend:
10. Before running the API tests, create a new branch with suffix `-api-tests`. I.e., if the branch you worked on was
`fix-xyz`, the new branch for the API tests will be `fix-xyz-api-tests`.

11. Start the backend:

./dev-testing/start-backend.sh

8. Run API tests against backend using:
12. Run the API tests against the backend using:

./test_api_fast.sh http://localhost:8080 ./testcases_output_api ./testcases_expected_output_api --deep

9. The results of step 8 is written to `./apitester4.log`. Differences are written to the end of the file. When there are no
13. The results of step 12 are written to `./apitester4.log`. Differences are written to the end of the file. When there are no
differences, this file will end with these lines:

RecursiveJsonDiff.diff() reported success
apitester reported success; exit code 0

10. Ensure that all differences listed in `./apitester4.log` are accounted for. Once you are happy with the output, remove
14. Ensure that all differences listed in `./apitester4.log` are accounted for. Once you are happy with the output, remove
the old `testcases_expected_output_api` and replace with new expected output:

rm -rf testcases_expected_output_api
cp -r testcases_output testcases_expected_output_api
cp -r testcases_output_api testcases_expected_output_api

11. Add the latest expected outputs to Git:
15. Add the latest expected outputs to Git:

git add -A testcases_expected_output
git add -A testcases_expected_output_api

**You should do this in the same commit as your code/test changes because then we can track exactly
what changed in the output.**
16. Commit the API tests to the branch with the `-api-tests` suffix created in step 10.

12. You can stop the OLS4 backend with "Ctrl-C", and Solr and Neo4J with:
17. You can stop the OLS4 backend with `Ctrl-C`, and Solr and Neo4J with:

./dev-testing/stopNeo4JSolr.sh

@@ -35,7 +35,6 @@

import org.springframework.web.bind.annotation.RestController;
import uk.ac.ebi.spot.ols.repository.Validation;
import uk.ac.ebi.spot.ols.repository.neo4j.OlsNeo4jClient;
import uk.ac.ebi.spot.ols.repository.solr.OlsSolrClient;
import uk.ac.ebi.spot.ols.repository.transforms.LocalizationTransform;
import uk.ac.ebi.spot.ols.repository.transforms.RemoveLiteralDatatypesTransform;
@@ -92,8 +91,10 @@ public void search(
String[] fields = {"label_s", "synonym_s", "short_form_s", "obo_id_s", "iri_s", "annotations_trimmed"};
solrQuery.setQuery(
"((" +
createUnionQuery(query.toLowerCase(), SolrFieldMapper.mapFieldsList(List.of(fields)).toArray(new String[0]), true)
+ ") AND (isDefiningOntology:\"true\"^100 OR isDefiningOntology:false^0))"
createUnionQuery(query.toLowerCase(), SolrFieldMapper.mapFieldsList(List.of(fields))
.toArray(new String[0]), true)
+ ") AND ("+ IS_DEFINING_ONTOLOGY.getText() + ":\"true\"^100 OR " +
IS_DEFINING_ONTOLOGY.getText() + ":false^0))"
);

} else {
@@ -106,13 +107,14 @@ public void search(
solrQuery.set("qf", String.join(" ", SolrFieldMapper.mapFieldsList(List.of(fields))));

solrQuery.set("bq",
"isDefiningOntology:\"true\"^100 " +
IS_DEFINING_ONTOLOGY.getText() + ":\"true\"^100 " +
"lowercase_label:\"" + query.toLowerCase() + "\"^5 " +
"lowercase_synonym:\"" + query.toLowerCase() + "\"^3");
}
} else {
if (exact) {
String[] fields = SolrFieldMapper.mapFieldsList(queryFields.stream().map(queryField -> queryField + "_s").collect(Collectors.toList())).toArray(new String[0]);
String[] fields = SolrFieldMapper.mapFieldsList(queryFields.stream().map(queryField -> queryField + "_s")
.collect(Collectors.toList())).toArray(new String[0]);
solrQuery.setQuery(createUnionQuery(query.toLowerCase(), fields, true));
} else {

@@ -140,7 +142,7 @@ public void search(
}

if (isLocal) {
solrQuery.addFilterQuery("imported:false");
solrQuery.addFilterQuery(IMPORTED.getText() + ":false");
}

if (isLeaf) {
@@ -207,7 +209,7 @@ public void search(
"ontologyIri",
"ontologyPreferredPrefix",
"type",
"isDefiningOntology",
IS_DEFINING_ONTOLOGY.getText(),
IS_OBSOLETE.getText());
/*
* Fix: End
@@ -259,8 +261,9 @@ public void search(
if (fieldList.contains("description")) outDoc.put("description", JsonHelper.getStrings(json, "definition"));
if (fieldList.contains("short_form")) outDoc.put("short_form", JsonHelper.getString(json, "shortForm"));
if (fieldList.contains("obo_id")) outDoc.put("obo_id", JsonHelper.getString(json, "curie"));
if (fieldList.contains("is_defining_ontology")) outDoc.put("is_defining_ontology",
JsonHelper.getString(json, "isDefiningOntology") != null && JsonHelper.getString(json, "isDefiningOntology").equals("true"));
if (fieldList.contains(IS_DEFINING_ONTOLOGY.getOls3Text())) outDoc.put(IS_DEFINING_ONTOLOGY.getOls3Text(),
JsonHelper.getString(json, IS_DEFINING_ONTOLOGY.getText()) != null &&
JsonHelper.getString(json, IS_DEFINING_ONTOLOGY.getText()).equals("true"));
if (fieldList.contains("type")) {
outDoc.put("type", JsonHelper.getType(json, "type"));
}
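A recurring change across these controllers and repositories replaces hard-coded Solr field names such as `isDefiningOntology` and `imported` with constants from the shared `uk.ac.ebi.ols.shared.DefinedFields` enum, which also knows the corresponding OLS3-era API field names via `getOls3Text()`. The following is a minimal sketch of what that enum plausibly looks like, reconstructed only from the `getText()`/`getOls3Text()` calls visible in this diff; the real class may define more constants and differ in detail.

```java
// Hypothetical reconstruction of uk.ac.ebi.ols.shared.DefinedFields, inferred
// solely from its usage in this commit. Only IS_DEFINING_ONTOLOGY's two names
// are directly evidenced by the diff; the other values are assumptions.
public enum DefinedFields {

    IS_DEFINING_ONTOLOGY("isDefiningOntology", "is_defining_ontology"),
    IMPORTED("imported", "imported"),                          // ols3 name assumed
    HAS_DIRECT_CHILDREN("hasDirectChildren", "has_children"),  // ols3 name assumed
    IS_OBSOLETE("isObsolete", "is_obsolete");                  // both names assumed

    private final String text;      // field name in the OLS4 Solr/Neo4j documents
    private final String ols3Text;  // field name in the OLS3-compatible JSON output

    DefinedFields(String text, String ols3Text) {
        this.text = text;
        this.ols3Text = ols3Text;
    }

    public String getText() { return text; }

    public String getOls3Text() { return ols3Text; }
}
```

Centralising the names this way means a boolean field rendered differently by the two API generations (`isDefiningOntology` vs. `is_defining_ontology`) is renamed in one place instead of in every query string.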
@@ -17,7 +17,6 @@
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import uk.ac.ebi.spot.ols.repository.Validation;
import uk.ac.ebi.spot.ols.repository.neo4j.OlsNeo4jClient;
import uk.ac.ebi.spot.ols.repository.solr.OlsSolrClient;
import uk.ac.ebi.spot.ols.repository.transforms.LocalizationTransform;
import uk.ac.ebi.spot.ols.repository.transforms.RemoveLiteralDatatypesTransform;
@@ -72,7 +71,9 @@ public void select(
solrQuery.setQuery(query);
solrQuery.set("defType", "edismax");
solrQuery.set("qf", "label whitespace_edge_label synonym whitespace_edge_synonym shortForm whitespace_edge_shortForm curie iri");
solrQuery.set("bq", "type:ontology^10.0 isDefiningOntology:true^100.0 str_label:\"" + queryLc + "\"^1000 edge_label:\"" + queryLc + "\"^500 str_synonym:\"" + queryLc + "\" edge_synonym:\"" + queryLc + "\"^100");
solrQuery.set("bq", "type:ontology^10.0 " +
IS_DEFINING_ONTOLOGY.getText() +":true^100.0 str_label:\"" + queryLc + "\"^1000 edge_label:\"" +
queryLc + "\"^500 str_synonym:\"" + queryLc + "\" edge_synonym:\"" + queryLc + "\"^100");
solrQuery.set("wt", "json");

solrQuery.setFields("_json", "id");
@@ -94,7 +95,7 @@
}

if (isLocal) {
solrQuery.addFilterQuery("isDefiningOntology:true");
solrQuery.addFilterQuery(IS_DEFINING_ONTOLOGY.getText() + ":true");
}

if (childrenOf != null) {
@@ -163,8 +164,9 @@
if (fieldList.contains("description")) outDoc.put("description", JsonHelper.getStrings(json, "definition"));
if (fieldList.contains("short_form")) outDoc.put("short_form", JsonHelper.getString(json, "shortForm"));
if (fieldList.contains("obo_id")) outDoc.put("obo_id", JsonHelper.getString(json, "curie"));
if (fieldList.contains("is_defining_ontology")) outDoc.put("is_defining_ontology",
JsonHelper.getString(json, "isDefiningOntology") != null && JsonHelper.getString(json, "isDefiningOntology").equals("true"));
if (fieldList.contains(IS_DEFINING_ONTOLOGY.getOls3Text())) outDoc.put(IS_DEFINING_ONTOLOGY.getOls3Text(),
JsonHelper.getString(json, IS_DEFINING_ONTOLOGY.getText()) != null &&
JsonHelper.getString(json, IS_DEFINING_ONTOLOGY.getText()).equals("true"));
if (fieldList.contains("type")) {
outDoc.put("type", JsonHelper.getType(json, "type"));
}
@@ -101,10 +101,8 @@ public HttpEntity<V2Entity> getIndividual(
}


// The instances of classes are individuals. So, the /instances endpoint is part of the Class controller.
//
@RequestMapping(path = "/ontologies/{onto}/classes/{class}/instances", produces = {MediaType.APPLICATION_JSON_VALUE }, method = RequestMethod.GET)
public HttpEntity<V2PagedResponse<V2Entity>> getClassInstances(
@RequestMapping(path = "/ontologies/{onto}/classes/{class}/individuals", produces = {MediaType.APPLICATION_JSON_VALUE }, method = RequestMethod.GET)
public HttpEntity<V2PagedResponse<V2Entity>> getClassIndividuals(
@PageableDefault(size = 20, page = 0) Pageable pageable,
@PathVariable("onto") String ontologyId,
@PathVariable("class") String classIri,
@@ -115,7 +113,7 @@ public HttpEntity<V2PagedResponse<V2Entity>> getClassInstances(

return new ResponseEntity<>(
new V2PagedResponse<>(
individualRepository.getInstancesOfClass(ontologyId, classIri, pageable, lang)
individualRepository.getIndividualsOfClass(ontologyId, classIri, pageable, lang)
),
HttpStatus.OK);

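This hunk renames the `/ontologies/{onto}/classes/{class}/instances` endpoint to `.../individuals` and `getInstancesOfClass` to `getIndividualsOfClass`. As an illustration only, a request against the renamed endpoint might look like the sketch below; it assumes the controller is mounted under `/api/v2`, uses the local backend from the README, and the ontology id and class IRI are placeholders. OLS expects the IRI to be double URL-encoded when embedded in the path.

```java
// Hypothetical client call to the renamed class-individuals endpoint.
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class ClassIndividualsRequest {
    public static void main(String[] args) throws Exception {
        String iri = "http://purl.obolibrary.org/obo/IAO_0000030";   // placeholder class IRI
        String encoded = URLEncoder.encode(
                URLEncoder.encode(iri, StandardCharsets.UTF_8), StandardCharsets.UTF_8);

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/api/v2/ontologies/iao/classes/"
                        + encoded + "/individuals?page=0&size=20"))
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());   // V2PagedResponse of individuals as JSON
    }
}
```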
@@ -14,6 +14,9 @@ public static String getString(JsonObject json, String key) {
return objectToString(json.get(key));
}

public static boolean getBoolean(JsonObject json, String key) {
return json.getAsJsonPrimitive(key).getAsBoolean();
}

public static String objectToString(JsonElement value) {

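The new `JsonHelper.getBoolean` reads a boolean primitive directly instead of fetching the value as a string and comparing it against `"true"`, as the jstree builders later in this diff still do. A small sketch contrasting the two patterns, with one caveat worth noting; the `GetBooleanDemo` wrapper class is purely illustrative.

```java
// Contrast of the old string-based check with the new helper. Gson's
// JsonObject/JsonPrimitive are the types JsonHelper already operates on.
import com.google.gson.JsonObject;
import com.google.gson.JsonPrimitive;

public class GetBooleanDemo {
    public static void main(String[] args) {
        JsonObject entity = new JsonObject();
        entity.add("hasDirectChildren", new JsonPrimitive(true));

        // Old pattern: stringify the value, then compare against "true".
        boolean viaString =
                "true".equals(JsonHelper.getString(entity, "hasDirectChildren"));

        // New pattern: read the boolean primitive directly.
        boolean viaBoolean = JsonHelper.getBoolean(entity, "hasDirectChildren");

        System.out.println(viaString + " " + viaBoolean);   // true true

        // Caveat: getAsJsonPrimitive(key) returns null when the key is absent,
        // so getBoolean throws a NullPointerException for missing fields,
        // whereas the string comparison simply evaluates to false.
    }
}
```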
@@ -8,6 +8,7 @@
import java.nio.charset.StandardCharsets;
import java.util.*;
import java.util.stream.Collectors;
import static uk.ac.ebi.ols.shared.DefinedFields.*;

public class V1AncestorsJsTreeBuilder {

@@ -90,8 +91,8 @@ private void createJsTreeEntries(List<Map<String,Object>> jstree, JsonObject ent
boolean opened = (!selected);


boolean hasDirectChildren = Objects.equals(JsonHelper.getString(entity, "hasDirectChildren"), "true");
boolean hasHierarchicalChildren = Objects.equals(JsonHelper.getString(entity, "hasHierarchicalChildren"), "true");
boolean hasDirectChildren = Objects.equals(JsonHelper.getString(entity, HAS_DIRECT_CHILDREN.getText()), "true");
boolean hasHierarchicalChildren = Objects.equals(JsonHelper.getString(entity, "hasHierarchicalChildren"), "true");

// only nodes that aren't already opened are marked as having children, (iff they actually have children!)
boolean children = (!opened) && (hasDirectChildren || hasHierarchicalChildren);
@@ -4,6 +4,7 @@

import java.nio.charset.StandardCharsets;
import java.util.*;
import static uk.ac.ebi.ols.shared.DefinedFields.*;

public class V1ChildrenJsTreeBuilder {

@@ -34,7 +35,7 @@ List<Map<String,Object>> buildJsTree() {
jstreeEntry.put("text", JsonHelper.getString(child.getAsJsonObject(), "label"));
jstreeEntry.put("state", Map.of("opened", false));
jstreeEntry.put("children",
JsonHelper.getString(child.getAsJsonObject(), "hasDirectChildren").equals("true")
JsonHelper.getString(child.getAsJsonObject(), HAS_DIRECT_CHILDREN.getText()).equals("true")
|| JsonHelper.getString(child.getAsJsonObject(), "hasHierarchicalChildren").equals("true")
);

@@ -19,6 +19,7 @@
import uk.ac.ebi.spot.ols.repository.solr.OlsFacetedResultsPage;
import uk.ac.ebi.spot.ols.repository.v1.mappers.V1IndividualMapper;
import uk.ac.ebi.spot.ols.repository.v1.mappers.V1TermMapper;
import static uk.ac.ebi.ols.shared.DefinedFields.*;

//@RepositoryRestResource(collectionResourceRel = "individuals", exported = false)
@Component
@@ -116,7 +117,7 @@ public OlsFacetedResultsPage<V1Individual> findAllByIsDefiningOntology(String la

OlsSolrQuery query = new OlsSolrQuery();
query.addFilter("type", List.of("individual"), SearchType.WHOLE_FIELD);
query.addFilter("isDefiningOntology", List.of("true"), SearchType.WHOLE_FIELD);
query.addFilter(IS_DEFINING_ONTOLOGY.getText(), List.of("true"), SearchType.WHOLE_FIELD);

OlsFacetedResultsPage<JsonElement> entities = solrClient.searchSolrPaginated(query, pageable);

Expand Down Expand Up @@ -145,7 +146,7 @@ public OlsFacetedResultsPage<V1Individual> findAllByIriAndIsDefiningOntology(Str

OlsSolrQuery query = new OlsSolrQuery();
query.addFilter("type", List.of("individual"), SearchType.WHOLE_FIELD);
query.addFilter("isDefiningOntology", List.of("true"), SearchType.WHOLE_FIELD);
query.addFilter(IS_DEFINING_ONTOLOGY.getText(), List.of("true"), SearchType.WHOLE_FIELD);
query.addFilter("iri", List.of(iri), SearchType.WHOLE_FIELD);

OlsFacetedResultsPage<JsonElement> entities = solrClient.searchSolrPaginated(query, pageable);
@@ -176,7 +177,7 @@ public OlsFacetedResultsPage<V1Individual> findAllByShortFormAndIsDefiningOntolo

OlsSolrQuery query = new OlsSolrQuery();
query.addFilter("type", List.of("individual"), SearchType.WHOLE_FIELD);
query.addFilter("isDefiningOntology", List.of("true"), SearchType.WHOLE_FIELD);
query.addFilter(IS_DEFINING_ONTOLOGY.getText(), List.of("true"), SearchType.WHOLE_FIELD);
query.addFilter("shortForm", List.of(shortForm), SearchType.WHOLE_FIELD);

OlsFacetedResultsPage<JsonElement> entities = solrClient.searchSolrPaginated(query, pageable);
@@ -204,7 +205,7 @@ public OlsFacetedResultsPage<V1Individual> findAllByOboIdAndIsDefiningOntology(S

OlsSolrQuery query = new OlsSolrQuery();
query.addFilter("type", List.of("individual"), SearchType.WHOLE_FIELD);
query.addFilter("isDefiningOntology", List.of("true"), SearchType.WHOLE_FIELD);
query.addFilter(IS_DEFINING_ONTOLOGY.getText(), List.of("true"), SearchType.WHOLE_FIELD);
query.addFilter("oboId", List.of(oboId), SearchType.WHOLE_FIELD);

OlsFacetedResultsPage<JsonElement> entities = solrClient.searchSolrPaginated(query, pageable);