mirror of https://gitee.com/bigwinds/arangodb
issue #736: AQL function to parse collection and key from document handle
Conflicts: CHANGELOG
parent 622edb0fee
commit 583878176b
CHANGELOG
@@ -1,6 +1,117 @@
v1.5.0 (XXXX-XX-XX)
-------------------

* issue #738: added __dirname, __filename pseudo-globals. Fixes #733. (by @pluma)

* allow `\n` (as well as `\r\n`) as line terminator in batch requests sent to
  the `/_api/batch` HTTP API
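A minimal batch request body using plain `\n` line terminators might look as follows. This is a sketch only: the boundary value and the embedded request are arbitrary examples, and the commented `curl` call (against an assumed local server) merely illustrates how such a body would be posted.

```shell
# Build a minimal /_api/batch body using plain "\n" line terminators
# (previously "\r\n" was required). Boundary name is an arbitrary example.
BOUNDARY=SomeBoundaryValue
BODY="--${BOUNDARY}
Content-Type: application/x-arango-batchpart

GET /_api/version HTTP/1.1

--${BOUNDARY}--"

printf '%s\n' "$BODY"

# Hypothetical submission to a locally running server:
# curl -X POST --data-binary "$BODY" \
#   -H "Content-Type: multipart/form-data; boundary=${BOUNDARY}" \
#   http://localhost:8529/_api/batch
```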
* use `--data-binary` instead of `--data` parameter in generated cURL examples

* issue #703: also show path of logfile for fm.config()

* issue #675: dropping a collection used in "graph" module breaks the graph

* added "static" Graph.drop() method for graphs API

* fixed issue #695: arangosh server.password error

* use pretty-printing in `--console` mode by default

* added `check-server` binary for testing

* simplified ArangoDB startup options

  Some startup options are now superfluous, or their usage has been simplified.
  The following options have been changed:

  * `--javascript.modules-path`: this option has been removed. The module paths
    are determined by arangod and arangosh automatically, based on the value of
    `--javascript.startup-directory`.

    If the option is set on startup, it is ignored, so startup will not abort
    with an `unrecognized option` error.

  * `--javascript.action-directory`: this option has been removed. The actions
    directory is determined by arangod automatically, based on the value of
    `--javascript.startup-directory`.

    If the option is set on startup, it is ignored, so startup will not abort
    with an `unrecognized option` error.

  * `--javascript.package-path`: this option is still available, but it is no
    longer required in order to set the standard package paths (e.g. `js/npm`).
    arangod will automatically use the standard package path regardless of
    whether it was specified via the options.

    It is still possible to use this option to add additional package paths to
    the standard value.

  Configuration files included with arangod are adjusted accordingly.
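The package-path change above can be illustrated with a startup sketch. All paths here are made-up examples, not shipped defaults; only `--javascript.startup-directory` still has to be set explicitly.

```shell
# Before 1.5, standard package paths (e.g. js/npm) had to be listed
# explicitly. With 1.5, arangod adds them automatically, so the option is
# only needed for *extra* package paths (example directory is hypothetical):
arangod \
  --javascript.startup-directory /usr/share/arangodb/js \
  --javascript.package-path /opt/my-packages \
  /var/lib/arangodb
```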
* layout of the graphs tab adapted to better fit with the other tabs

* database selection is moved to the bottom right corner of the web interface

* removed priority queues

  This feature was never advertised, documented, or tested.

* display internal attributes in document source view of web interface

* removed separate shape collections

  When upgrading to ArangoDB 1.5, existing collections will be converted to
  include shapes and attribute markers in the datafiles, instead of using
  separate files for shapes.

  When a collection is converted, existing shapes from the SHAPES directory
  will be written to a new datafile in the collection directory, and the SHAPES
  directory will be removed afterwards.

  This saves up to 2 MB of memory and disk space for each collection (the
  savings are higher the fewer distinct shapes there are in a collection).
  Additionally, one fewer file descriptor per opened collection will be used.

  When creating a new collection, the number of sync calls may be reduced. The
  same may be true for documents with yet-unknown shapes. This may help
  performance in these cases.

* added AQL functions `NTH` and `POSITION`
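A plain-JavaScript sketch of what the two new list functions compute may be helpful. This assumes the usual semantics (0-based positions, `null` for an out-of-range `NTH`, and `POSITION` returning a boolean unless a third argument requests the index); it is an approximation for illustration, not the server implementation.

```javascript
// Sketch of NTH / POSITION semantics (assumptions: 0-based positions,
// null / -1 for misses, primitive-value comparison only).
function NTH (list, position) {
  if (position < 0 || position >= list.length) {
    return null; // out-of-range positions yield null
  }
  return list[position];
}

function POSITION (list, search, returnIndex) {
  var index = list.indexOf(search); // strict equality; primitives only here
  if (returnIndex) {
    return index; // -1 when not found
  }
  return index !== -1;
}

console.log(NTH([ "a", "b", "c" ], 1));      // "b"
console.log(NTH([ "a", "b", "c" ], 7));      // null
console.log(POSITION([ 1, 2, 3 ], 2));       // true
console.log(POSITION([ 1, 2, 3 ], 2, true)); // 1
```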
* added signal handler for arangosh to save the last command in more cases

* added extra prompt placeholders for arangosh:
  - `%e`: current endpoint
  - `%u`: current user

* added arangosh option `--javascript.gc-interval` to control the amount of
  garbage collection performed by arangosh

* fixed issue #651: allow addEdge() to take vertex ids in the JS library

* removed command-line option `--log.format`

  In previous versions, this option did not have an effect for most log
  messages, so it was removed.

* removed C++ logger implementation

  Logging inside ArangoDB is now done using the LOG_XXX() macros. The
  LOGGER_XXX() macros are gone.

* added collection status "loading"

* added the option to return the number of elements indexed in the result of
  <collection>.getIndexes() for each index. This is currently only implemented
  for hash indexes and skiplist indexes.


v1.4.6 (XXXX-XX-XX)
-------------------

* issue #736: AQL function to parse collection and key from document handle

* added fm.rescan() method for Foxx-Manager

* fixed issue #734: foxx cookie and route problem
@@ -1249,6 +1249,19 @@ AQL supports the following functions to operate on document values:

    RETURN KEEP(doc, 'firstname', 'name', 'likes')

- @FN{PARSE_IDENTIFIER(@FA{document-handle})}: parses the document handle specified in
  @FA{document-handle} and returns the handle's individual parts as separate attributes.
  This function can be used to easily determine the collection name and key of a given
  document. The @FA{document-handle} can either be a regular document from a collection,
  or a document identifier string (e.g. `_users/1234`). Passing either a non-string, a
  non-document, or a document without an `_id` attribute will result in an error.

      RETURN PARSE_IDENTIFIER('_users/my-user')
      [ { "collection" : "_users", "key" : "my-user" } ]

      RETURN PARSE_IDENTIFIER({ "_id" : "mycollection/mykey", "value" : "some value" })
      [ { "collection" : "mycollection", "key" : "mykey" } ]

@subsubsection AqlFunctionsGeo Geo functions

AQL offers the following functions to filter data based on geo indexes:
@@ -712,6 +712,7 @@ TRI_associative_pointer_t* TRI_CreateFunctionsAql (void) {
  REGISTER_FUNCTION("NOT_NULL", "NOT_NULL", true, false, ".|+", NULL);
  REGISTER_FUNCTION("FIRST_LIST", "FIRST_LIST", true, false, ".|+", NULL);
  REGISTER_FUNCTION("FIRST_DOCUMENT", "FIRST_DOCUMENT", true, false, ".|+", NULL);
  REGISTER_FUNCTION("PARSE_IDENTIFIER", "PARSE_IDENTIFIER", true, false, ".", NULL);

  if (! result) {
    TRI_FreeFunctionsAql(functions);
@@ -3217,6 +3217,36 @@ function FIRST_DOCUMENT () {
  return null;
}

////////////////////////////////////////////////////////////////////////////////
/// @brief return the parts of a document identifier separately
///
/// returns a document with the attributes `collection` and `key` or fails if
/// the individual parts cannot be determined.
////////////////////////////////////////////////////////////////////////////////

function PARSE_IDENTIFIER (value) {
  "use strict";

  if (TYPEWEIGHT(value) === TYPEWEIGHT_STRING) {
    var parts = value.split('/');
    if (parts.length === 2) {
      return {
        collection: parts[0],
        key: parts[1]
      };
    }
    // fall through intentional
  }
  else if (TYPEWEIGHT(value) === TYPEWEIGHT_DOCUMENT) {
    if (value.hasOwnProperty('_id')) {
      return PARSE_IDENTIFIER(value._id);
    }
    // fall through intentional
  }

  THROW(INTERNAL.errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH, "PARSE_IDENTIFIER");
}

////////////////////////////////////////////////////////////////////////////////
/// @brief check whether a document has a specific attribute
////////////////////////////////////////////////////////////////////////////////
@@ -4009,6 +4039,7 @@ exports.GRAPH_NEIGHBORS = GRAPH_NEIGHBORS;
exports.NOT_NULL = NOT_NULL;
exports.FIRST_LIST = FIRST_LIST;
exports.FIRST_DOCUMENT = FIRST_DOCUMENT;
exports.PARSE_IDENTIFIER = PARSE_IDENTIFIER;
exports.HAS = HAS;
exports.ATTRIBUTES = ATTRIBUTES;
exports.UNSET = UNSET;
@@ -1726,6 +1726,90 @@ function ahuacatlFunctionsTestSuite () {
      assertEqual(expected, actual);
    },

////////////////////////////////////////////////////////////////////////////////
/// @brief test parse identifier function
////////////////////////////////////////////////////////////////////////////////

    testParseIdentifier : function () {
      var actual;

      actual = getQueryResults("RETURN PARSE_IDENTIFIER('foo/bar')");
      assertEqual([ { collection: 'foo', key: 'bar' } ], actual);

      actual = getQueryResults("RETURN PARSE_IDENTIFIER('this-is-a-collection-name/and-this-is-an-id')");
      assertEqual([ { collection: 'this-is-a-collection-name', key: 'and-this-is-an-id' } ], actual);

      actual = getQueryResults("RETURN PARSE_IDENTIFIER('MY_COLLECTION/MY_DOC')");
      assertEqual([ { collection: 'MY_COLLECTION', key: 'MY_DOC' } ], actual);

      actual = getQueryResults("RETURN PARSE_IDENTIFIER('_users/AbC')");
      assertEqual([ { collection: '_users', key: 'AbC' } ], actual);

      actual = getQueryResults("RETURN PARSE_IDENTIFIER({ _id: 'foo/bar', value: 'baz' })");
      assertEqual([ { collection: 'foo', key: 'bar' } ], actual);

      actual = getQueryResults("RETURN PARSE_IDENTIFIER({ ignore: true, _id: '_system/VALUE', value: 'baz' })");
      assertEqual([ { collection: '_system', key: 'VALUE' } ], actual);

      actual = getQueryResults("RETURN PARSE_IDENTIFIER({ value: 123, _id: 'Some-Odd-Collection/THIS_IS_THE_KEY' })");
      assertEqual([ { collection: 'Some-Odd-Collection', key: 'THIS_IS_THE_KEY' } ], actual);
    },

////////////////////////////////////////////////////////////////////////////////
/// @brief test parse identifier function on documents from a collection
////////////////////////////////////////////////////////////////////////////////

    testParseIdentifierCollection : function () {
      var cn = "UnitTestsAhuacatlFunctions";

      internal.db._drop(cn);
      var cx = internal.db._create(cn);
      cx.save({ "title" : "123", "value" : 456, "_key" : "foobar" });
      cx.save({ "_key" : "so-this-is-it", "title" : "nada", "value" : 123 });

      var expected, actual;

      expected = [ { collection: cn, key: "foobar" } ];
      actual = getQueryResults("RETURN PARSE_IDENTIFIER(DOCUMENT(CONCAT(@cn, '/', @key)))", { cn: cn, key: "foobar" });
      assertEqual(expected, actual);

      expected = [ { collection: cn, key: "foobar" } ];
      actual = getQueryResults("RETURN PARSE_IDENTIFIER(DOCUMENT(CONCAT(@cn, '/', 'foobar')))", { cn: cn });
      assertEqual(expected, actual);

      expected = [ { collection: cn, key: "foobar" } ];
      actual = getQueryResults("RETURN PARSE_IDENTIFIER(DOCUMENT([ @key ])[0])", { key: "UnitTestsAhuacatlFunctions/foobar" });
      assertEqual(expected, actual);

      expected = [ { collection: cn, key: "so-this-is-it" } ];
      actual = getQueryResults("RETURN PARSE_IDENTIFIER(DOCUMENT([ 'UnitTestsAhuacatlFunctions/so-this-is-it' ])[0])");
      assertEqual(expected, actual);

      internal.db._drop(cn);
    },

////////////////////////////////////////////////////////////////////////////////
/// @brief test parse identifier function with invalid input
////////////////////////////////////////////////////////////////////////////////

    testParseIdentifierInvalid : function () {
      assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_NUMBER_MISMATCH.code, "RETURN PARSE_IDENTIFIER()");
      assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_NUMBER_MISMATCH.code, "RETURN PARSE_IDENTIFIER('foo', 'bar')");
      assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER(null)");
      assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER(false)");
      assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER(3)");
      assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER(\"foo\")");
      assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER('foo bar')");
      assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER('foo/bar/baz')");
      assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER([ ])");
      assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER({ })");
      assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER({ foo: 'bar' })");
    },

////////////////////////////////////////////////////////////////////////////////
/// @brief test document function
////////////////////////////////////////////////////////////////////////////////