I'm trying to do some performance testing on our application, which means I'm constantly swapping data in and out of our DB.
To do this I use a small Python script that runs a LOAD CSV command pointing to a public CSV file stored in GCS.
However, running it recently, it seems like AuraDB may be caching the file, because it keeps loading data that no longer exists in the CSV.
My query looks something like this:
USING PERIODIC COMMIT 500
LOAD CSV WITH HEADERS FROM 'https://storage.googleapis.com/my-bucket/my-csv' AS row
WITH row
WHERE row.userID IS NOT NULL
  AND row.projectID IS NOT NULL
  AND row.environmentID IS NOT NULL
MERGE (user:User {userID: row.userID})
ON CREATE SET user.projectID = row.projectID,
              user.environmentID = row.environmentID;
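
For completeness, the script is essentially just this query wrapped in the official neo4j Python driver. Here's a minimal sketch of it; the URI, credentials, and bucket URL are placeholders, not our real values:

from neo4j import GraphDatabase

# Placeholders: fill in real Aura connection details.
URI = "neo4j+s://<instance-id>.databases.neo4j.io"
AUTH = ("neo4j", "<password>")
CSV_URL = "https://storage.googleapis.com/my-bucket/my-csv"

# Same query as above, with the file URL passed in as a parameter.
LOAD_QUERY = """
USING PERIODIC COMMIT 500
LOAD CSV WITH HEADERS FROM $url AS row
WITH row
WHERE row.userID IS NOT NULL
  AND row.projectID IS NOT NULL
  AND row.environmentID IS NOT NULL
MERGE (user:User {userID: row.userID})
ON CREATE SET user.projectID = row.projectID,
              user.environmentID = row.environmentID
"""

driver = GraphDatabase.driver(URI, auth=AUTH)
with driver.session() as session:
    # USING PERIODIC COMMIT must run in an auto-commit transaction,
    # which is what session.run provides.
    session.run(LOAD_QUERY, url=CSV_URL)
driver.close()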
This seems to be picking up old data when I check the graph afterwards. Are you guys doing some level of caching of the fetched file?