
Merge branch 'sharding' of ssh://github.com/triAGENS/ArangoDB into sharding

This commit is contained in:
Max Neunhoeffer 2014-01-21 09:57:26 +01:00
commit eec3c08f87
36 changed files with 1387 additions and 410 deletions

View File

@@ -1,6 +1,16 @@
 v1.5.0 (XXXX-XX-XX)
 -------------------
+* allow direct access from the `db` object to collections whose names start
+  with an underscore (e.g. db._users).
+  Previously, access to such collections via the `db` object was possible from
+  arangosh, but not from arangod (and thus Foxx and actions). The only way
+  to access such collections from these places was via the `db._collection(<name>)`
+  workaround.
+* issue #738: added __dirname, __filename pseudo-globals. Fixes #733. (by @pluma)
 * allow `\n` (as well as `\r\n`) as line terminator in batch requests sent to
   `/_api/batch` HTTP API.
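The changelog entry above says the batch API now accepts both `\r\n` and plain `\n` as line terminators. A minimal, language-neutral sketch of such tolerant line splitting (a hypothetical helper, not ArangoDB's actual parser):

```python
def split_batch_lines(body: str) -> list[str]:
    # Normalize `\r\n` to `\n` first, then split; drop empty lines,
    # since a batch parser would have nothing to do with them.
    return [line for line in body.replace("\r\n", "\n").split("\n") if line]

parts = split_batch_lines('{"a":1}\r\n{"b":2}\n{"c":3}')
```

Normalizing before splitting keeps the parser oblivious to which terminator the client used.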
@@ -108,6 +118,12 @@ v1.5.0 (XXXX-XX-XX)
 v1.4.6 (XXXX-XX-XX)
 -------------------
+* issue #736: AQL function to parse collection and key from document handle
+* added fm.rescan() method for Foxx-Manager
+* fixed issue #734: foxx cookie and route problem
 * added method `fm.configJson` for arangosh
 * include `startupPath` in result of API `/_api/foxx/config`

View File

@@ -7,4 +7,4 @@ server: triagens GmbH High-Performance HTTP Server
 connection: Keep-Alive
 content-type: application/json; charset=utf-8
-{"error":false,"created":2,"errors":0}
+{"error":false,"created":2,"empty":0,"errors":0}

View File

@@ -8,4 +8,4 @@ server: triagens GmbH High-Performance HTTP Server
 connection: Keep-Alive
 content-type: application/json; charset=utf-8
-{"error":false,"created":2,"errors":0}
+{"error":false,"created":2,"empty":0,"errors":0}

View File

@@ -93,7 +93,14 @@ the data are line-wise JSON documents (type = documents) or a JSON list (type =
 The server will respond with an HTTP 201 if everything went well. The number of
 documents imported will be returned in the `created` attribute of the
 response. If any documents were skipped or incorrectly formatted, this will be
-returned in the `errors` attribute.
+returned in the `errors` attribute. There will also be an attribute `empty` in
+the response, which will contain a value of `0`.
+
+If the `details` parameter was set to `true` in the request, the response will
+also contain an attribute `details` which is a list of details about errors that
+occurred on the server side during the import. This list might be empty if no
+errors occurred.
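The response attributes described above (`created`, `errors`, `empty`, and the optional `details` list) can be summarized client-side. A small sketch with a hypothetical helper name, assuming only the attribute shapes documented here:

```python
def summarize_import_response(response: dict) -> str:
    # `created`, `errors` and `empty` are documented attributes of the
    # import response; `details` is only present when requested.
    parts = [
        f"created: {response.get('created', 0)}",
        f"errors: {response.get('errors', 0)}",
        f"empty: {response.get('empty', 0)}",
    ]
    for detail in response.get("details", []):
        parts.append(f"detail: {detail}")
    return "\n".join(parts)
```

Using `.get()` with defaults keeps the helper robust when optional attributes are absent.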
 Importing Headers and Values {#HttpImportHeaderData}
 ====================================================
@@ -112,7 +119,13 @@ are needed or allowed in this data section.
 The server will again respond with an HTTP 201 if everything went well. The
 number of documents imported will be returned in the `created` attribute of the
 response. If any documents were skipped or incorrectly formatted, this will be
-returned in the `errors` attribute.
+returned in the `errors` attribute. The number of empty lines in the input file
+will be returned in the `empty` attribute.
+
+If the `details` parameter was set to `true` in the request, the response will
+also contain an attribute `details` which is a list of details about errors that
+occurred on the server side during the import. This list might be empty if no
+errors occurred.
 
 Importing into Edge Collections {#HttpImportEdges}
 ==================================================

View File

@@ -52,7 +52,8 @@ specify a password, you will be prompted for one.
 Note that the collection (`users` in this case) must already exist or the import
 will fail. If you want to create a new collection with the import data, you need
 to specify the `--create-collection` option. Note that it is only possible to
-create a document collection using the `--create-collection` flag.
+create a document collection using the `--create-collection` flag, not an edge
+collection.
 
 unix> arangoimp --file "data.json" --type json --collection "users" --create-collection true
@@ -65,6 +66,18 @@ Please note that by default, _arangoimp_ will import data into the specified
 collection in the default database (`_system`). To specify a different database,
 use the `--server.database` option when invoking _arangoimp_.
 
+An _arangoimp_ import will print out the final results on the command line.
+By default, it shows the number of documents created, the number of errors that
+occurred on the server side, and the total number of input file lines/documents
+that it processed. Additionally, _arangoimp_ will print out details about errors
+that happened on the server side (if any).
+
+Example:
+
+    created: 2
+    errors: 0
+    total: 2
 Importing CSV Data {#ImpManualCsv}
 ==================================
@@ -114,3 +127,50 @@ with the `--separator` argument.
 An example command line to execute the TSV import is:
 
 unix> arangoimp --file "data.tsv" --type tsv --collection "users"
+Importing into an Edge Collection {#ImpManualEdges}
+===================================================
+
+arangoimp can also be used to import data into an existing edge collection.
+The import data must, for each edge to import, contain at least the `_from` and
+`_to` attributes. These indicate which other two documents the edge should connect.
+It is necessary that these attributes are set for all records, and point to
+valid document ids in existing collections.
+
+Example:
+
+    { "_from" : "users/1234", "_to" : "users/4321", "desc" : "1234 is connected to 4321" }
+
+Note that the edge collection must already exist when the import is started. Using
+the `--create-collection` flag will not work because arangoimp will always try to
+create a regular document collection if the target collection does not exist.
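The rule above (every edge record needs `_from` and `_to` set to document ids of the form `collection/key`) can be pre-checked before uploading. A sketch with a hypothetical validator, covering only the format check described here, not whether the referenced documents actually exist:

```python
def check_edge_record(record: dict) -> list[str]:
    """Return a list of problems that would make an edge import record invalid."""
    problems = []
    for attr in ("_from", "_to"):
        value = record.get(attr)
        # a document id is "collection/key": a string with exactly one slash
        if not isinstance(value, str) or value.count("/") != 1:
            problems.append(f"{attr} must be a 'collection/key' document id")
    return problems
```

Running this over an input file before the import surfaces malformed records early instead of as server-side errors.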
+Attribute Naming and Special Attributes {#ImpManualAttributes}
+==============================================================
+
+Attributes whose names start with an underscore are treated in a special way by
+ArangoDB:
+
+- the optional `_key` attribute contains the document's key. If specified, the value
+  must be formally valid (e.g. must be a string and conform to the naming conventions
+  for @ref DocumentKeys). Additionally, the key value must be unique within the
+  collection the import is run for.
+- `_from`: when importing into an edge collection, this attribute contains the id
+  of one of the documents connected by the edge. The value of `_from` must be a
+  syntactically valid document id and the referred collection must exist.
+- `_to`: when importing into an edge collection, this attribute contains the id
+  of the other document connected by the edge. The value of `_to` must be a
+  syntactically valid document id and the referred collection must exist.
+- `_rev`: this attribute contains the revision number of a document. However, the
+  revision numbers are managed by ArangoDB and cannot be specified on import. Thus
+  any value in this attribute is ignored on import.
+
+- all other attributes starting with an underscore are discarded on import without
+  any warnings.
+
+If you import values into `_key`, you should make sure they are valid and unique.
+
+When importing data into an edge collection, you should make sure that all import
+documents contain `_from` and `_to` and that their values point to existing documents.
+
+Finally you should make sure that all other attributes in the import file do not
+start with an underscore - otherwise they might be discarded.
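The underscore rules above can be mimicked client-side to predict what an import will keep. A sketch (hypothetical helper, assuming exactly the rules stated: `_key`, `_from`, `_to` are honored, `_rev` and all other underscore attributes are dropped):

```python
# system attributes the import honors, per the list above
KEPT_SYSTEM_ATTRIBUTES = {"_key", "_from", "_to"}

def filter_import_document(doc: dict) -> dict:
    # `_rev` and any other attribute starting with an underscore is
    # silently discarded; everything else passes through unchanged
    return {k: v for k, v in doc.items()
            if not k.startswith("_") or k in KEPT_SYSTEM_ATTRIBUTES}
```

Comparing each input document against its filtered version is an easy way to spot attributes that would be lost silently.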

View File

@@ -5,3 +5,5 @@ TOC {#ImpManualTOC}
 - @ref ImpManualJson
 - @ref ImpManualCsv
 - @ref ImpManualTsv
+- @ref ImpManualEdges
+- @ref ImpManualAttributes

View File

@@ -1259,6 +1259,19 @@ AQL supports the following functions to operate on document values:
 
     RETURN KEEP(doc, 'firstname', 'name', 'likes')
 
+- @FN{PARSE_IDENTIFIER(@FA{document-handle})}: parses the document handle specified in
+  @FA{document-handle} and returns the handle's individual parts as separate attributes.
+  This function can be used to easily determine the collection name and key from a given document.
+  The @FA{document-handle} can either be a regular document from a collection, or a document
+  identifier string (e.g. `_users/1234`). Passing either a non-string or a non-document or a
+  document without an `_id` attribute will result in an error.
+
+      RETURN PARSE_IDENTIFIER('_users/my-user')
+      [ { "collection" : "_users", "key" : "my-user" } ]
+
+      RETURN PARSE_IDENTIFIER({ "_id" : "mycollection/mykey", "value" : "some value" })
+      [ { "collection" : "mycollection", "key" : "mykey" } ]
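The semantics documented for `PARSE_IDENTIFIER` boil down to splitting a `collection/key` handle at its first slash, with the handle coming either from a string or from a document's `_id` attribute. A sketch of that behavior in Python (an illustration, not ArangoDB's implementation):

```python
def parse_identifier(value) -> dict:
    # accept either an identifier string or a document with an `_id` attribute
    handle = value.get("_id") if isinstance(value, dict) else value
    if not isinstance(handle, str) or "/" not in handle:
        # mirrors the documented error cases: non-string, non-document,
        # or a document without an `_id`
        raise ValueError("not a valid document handle")
    collection, key = handle.split("/", 1)
    return {"collection": collection, "key": key}
```

Splitting on the first slash only matters because keys themselves never contain slashes, while it keeps the parse unambiguous.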
 @subsubsection AqlFunctionsGeo Geo functions
 
 AQL offers the following functions to filter data based on geo indexes:

View File

@@ -714,6 +714,7 @@ TRI_associative_pointer_t* TRI_CreateFunctionsAql (void) {
   REGISTER_FUNCTION("NOT_NULL", "NOT_NULL", true, false, ".|+", NULL);
   REGISTER_FUNCTION("FIRST_LIST", "FIRST_LIST", true, false, ".|+", NULL);
   REGISTER_FUNCTION("FIRST_DOCUMENT", "FIRST_DOCUMENT", true, false, ".|+", NULL);
+  REGISTER_FUNCTION("PARSE_IDENTIFIER", "PARSE_IDENTIFIER", true, false, ".", NULL);
 
   if (! result) {
     TRI_FreeFunctionsAql(functions);

View File

@@ -441,18 +441,23 @@ void ClusterInfo::loadCurrentDatabases () {
 /// Usually one does not have to call this directly.
 ////////////////////////////////////////////////////////////////////////////////
 
-void ClusterInfo::loadPlannedCollections () {
+void ClusterInfo::loadPlannedCollections (bool acquireLock) {
   static const std::string prefix = "Plan/Collections";
 
   AgencyCommResult result;
 
   {
+    if (acquireLock) {
       AgencyCommLocker locker("Plan", "READ");
 
       if (locker.successful()) {
         result = _agency.getValues(prefix, true);
       }
     }
+    else {
+      result = _agency.getValues(prefix, true);
+    }
   }
 
   if (result.successful()) {
     result.parse(prefix + "/", false);
@@ -529,7 +534,7 @@ CollectionInfo ClusterInfo::getCollection (DatabaseID const& databaseID,
   int tries = 0;
 
   if (! _collectionsValid) {
-    loadPlannedCollections();
+    loadPlannedCollections(true);
     ++tries;
   }
@ -550,7 +555,7 @@ CollectionInfo ClusterInfo::getCollection (DatabaseID const& databaseID,
} }
// must load collections outside the lock // must load collections outside the lock
loadPlannedCollections(); loadPlannedCollections(true);
} }
return CollectionInfo(); return CollectionInfo();
@@ -599,7 +604,7 @@ const std::vector<CollectionInfo> ClusterInfo::getCollections (DatabaseID const&
   std::vector<CollectionInfo> result;
 
   // always reload
-  loadPlannedCollections();
+  loadPlannedCollections(true);
 
   READ_LOCKER(_lock);
 
   // look up database by id
@@ -810,10 +815,29 @@ int ClusterInfo::createCollectionCoordinator (string const& databaseName,
   {
     AgencyCommLocker locker("Plan", "WRITE");
 
     if (! locker.successful()) {
       return setErrormsg(TRI_ERROR_CLUSTER_COULD_NOT_LOCK_PLAN, errorMsg);
     }
 
+    {
+      // check if a collection with the same name is already planned
+      loadPlannedCollections(false);
+
+      READ_LOCKER(_lock);
+      AllCollections::const_iterator it = _collections.find(databaseName);
+      if (it != _collections.end()) {
+        const std::string name = JsonHelper::getStringValue(json, "name", "");
+
+        DatabaseCollections::const_iterator it2 = (*it).second.find(name);
+
+        if (it2 != (*it).second.end()) {
+          // collection already exists!
+          return TRI_ERROR_ARANGO_DUPLICATE_NAME;
+        }
+      }
+    }
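The duplicate-name check added above is a classic check-then-insert pattern: the lookup and the creation must happen under the same lock, or two coordinators could both pass the check and create the same collection. A language-neutral sketch of the pattern in Python (a toy registry, not ArangoDB code):

```python
import threading

class Plan:
    """Toy collection registry illustrating check-then-insert under one lock."""

    def __init__(self):
        self._lock = threading.Lock()           # stands in for the "Plan" WRITE lock
        self._collections: dict[str, dict[str, dict]] = {}

    def create_collection(self, database: str, name: str) -> bool:
        with self._lock:
            per_db = self._collections.setdefault(database, {})
            if name in per_db:
                return False                    # duplicate name: reject, as above
            per_db[name] = {"name": name}       # insert while still holding the lock
            return True
```

Releasing the lock between the check and the insert would reintroduce exactly the race the commit closes.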
     if (! ac.exists("Plan/Databases/" + databaseName)) {
       return setErrormsg(TRI_ERROR_ARANGO_DATABASE_NOT_FOUND, errorMsg);
     }
@@ -1139,7 +1163,7 @@ ServerID ClusterInfo::getResponsibleServer (ShardID const& shardID) {
   int tries = 0;
 
   if (! _collectionsValid) {
-    loadPlannedCollections();
+    loadPlannedCollections(true);
     tries++;
   }
@ -1154,7 +1178,7 @@ ServerID ClusterInfo::getResponsibleServer (ShardID const& shardID) {
} }
// must load collections outside the lock // must load collections outside the lock
loadPlannedCollections(); loadPlannedCollections(true);
} }
return ServerID(""); return ServerID("");

View File

@@ -315,7 +315,7 @@ namespace triagens {
 /// Usually one does not have to call this directly.
 ////////////////////////////////////////////////////////////////////////////////
 
-      void loadPlannedCollections ();
+      void loadPlannedCollections (bool = true);
 
 ////////////////////////////////////////////////////////////////////////////////
 /// @brief flushes the list of planned databases

View File

@@ -244,14 +244,11 @@ int RestImportHandler::handleSingleDocument (ImportTransactionType& trx,
 /// @RESTQUERYPARAM{type,string,required}
 /// Determines how the body of the request will be interpreted. `type` can have
 /// the following values:
-///
 /// - `documents`: when this type is used, each line in the request body is
 ///   expected to be an individual JSON-encoded document. Multiple JSON documents
 ///   in the request body need to be separated by newlines.
-///
 /// - `list`: when this type is used, the request body must contain a single
 ///   JSON-encoded list of individual documents to import.
-///
 /// - `auto`: if set, this will automatically determine the body type (either
 ///   `documents` or `list`).
 ///
@@ -736,8 +733,9 @@ bool RestImportHandler::createFromJson (const string& type) {
 ///
 /// @RESTBODYPARAM{documents,string,required}
 /// The body must consist of JSON-encoded lists of attribute values, with one
-/// line per document. The first line of the request must be a JSON-encoded
-/// list of attribute names.
+/// line per document. The first row of the request must be a JSON-encoded
+/// list of attribute names. These attribute names are used for the data in the
+/// subsequent rows.
 ///
 /// @RESTQUERYPARAMETERS
 ///

View File

@@ -2082,11 +2082,6 @@ static v8::Handle<v8::Value> EnsureGeoIndexVocbaseCol (v8::Arguments const& argv
   TRI_primary_collection_t* primary = collection->_collection;
 
-  if (! TRI_IS_DOCUMENT_COLLECTION(collection->_type)) {
-    ReleaseCollection(collection);
-    TRI_V8_EXCEPTION_INTERNAL(scope, "unknown collection type");
-  }
-
   TRI_document_collection_t* document = (TRI_document_collection_t*) primary;
   TRI_index_t* idx = 0;
   bool created;
@@ -4440,11 +4435,6 @@ static v8::Handle<v8::Value> JS_UpgradeVocbaseCol (v8::Arguments const& argv) {
   TRI_primary_collection_t* primary = collection->_collection;
 
-  if (! TRI_IS_DOCUMENT_COLLECTION(collection->_type)) {
-    ReleaseCollection(collection);
-    TRI_V8_EXCEPTION_INTERNAL(scope, "unknown collection type");
-  }
-
   TRI_collection_t* col = &primary->base;
 
 #ifdef TRI_ENABLE_LOGGER
@@ -5156,11 +5146,6 @@ static v8::Handle<v8::Value> JS_DropIndexVocbaseCol (v8::Arguments const& argv)
   TRI_primary_collection_t* primary = collection->_collection;
 
-  if (! TRI_IS_DOCUMENT_COLLECTION(collection->_type)) {
-    ReleaseCollection(collection);
-    TRI_V8_EXCEPTION_INTERNAL(scope, "unknown collection type");
-  }
-
   TRI_document_collection_t* document = (TRI_document_collection_t*) primary;
 
   if (argv.Length() != 1) {
@@ -5242,11 +5227,6 @@ static v8::Handle<v8::Value> JS_EnsureCapConstraintVocbaseCol (v8::Arguments con
   TRI_primary_collection_t* primary = collection->_collection;
 
-  if (! TRI_IS_DOCUMENT_COLLECTION(collection->_type)) {
-    ReleaseCollection(collection);
-    TRI_V8_EXCEPTION_INTERNAL(scope, "unknown collection type");
-  }
-
   TRI_document_collection_t* document = (TRI_document_collection_t*) primary;
   TRI_index_t* idx = 0;
   bool created;
@@ -5343,11 +5323,6 @@ static v8::Handle<v8::Value> EnsureBitarray (v8::Arguments const& argv, bool sup
   TRI_primary_collection_t* primary = collection->_collection;
 
-  if (! TRI_IS_DOCUMENT_COLLECTION(collection->_type)) {
-    ReleaseCollection(collection);
-    TRI_V8_EXCEPTION_INTERNAL(scope, "unknown collection type");
-  }
-
   TRI_document_collection_t* document = (TRI_document_collection_t*) primary;
 
   // .............................................................................
@@ -6307,11 +6282,6 @@ static v8::Handle<v8::Value> JS_PropertiesVocbaseCol (v8::Arguments const& argv)
   TRI_primary_collection_t* primary = collection->_collection;
   TRI_collection_t* base = &primary->base;
 
-  if (! TRI_IS_DOCUMENT_COLLECTION(base->_info._type)) {
-    ReleaseCollection(collection);
-    TRI_V8_EXCEPTION_INTERNAL(scope, "unknown collection type");
-  }
-
   TRI_document_collection_t* document = (TRI_document_collection_t*) primary;
 
   // check if we want to change some parameters
@@ -6391,7 +6361,6 @@ static v8::Handle<v8::Value> JS_PropertiesVocbaseCol (v8::Arguments const& argv)
   // return the current parameter set
   v8::Handle<v8::Object> result = v8::Object::New();
 
-  if (TRI_IS_DOCUMENT_COLLECTION(base->_info._type)) {
   result->Set(v8g->DoCompactKey, base->_info._doCompact ? v8::True() : v8::False());
   result->Set(v8g->IsSystemKey, base->_info._isSystem ? v8::True() : v8::False());
   result->Set(v8g->IsVolatileKey, base->_info._isVolatile ? v8::True() : v8::False());
@ -6408,7 +6377,6 @@ static v8::Handle<v8::Value> JS_PropertiesVocbaseCol (v8::Arguments const& argv)
result->Set(v8g->KeyOptionsKey, v8::Array::New()); result->Set(v8g->KeyOptionsKey, v8::Array::New());
} }
result->Set(v8g->WaitForSyncKey, base->_info._waitForSync ? v8::True() : v8::False()); result->Set(v8g->WaitForSyncKey, base->_info._waitForSync ? v8::True() : v8::False());
}
ReleaseCollection(collection); ReleaseCollection(collection);
return scope.Close(result); return scope.Close(result);
@@ -6675,12 +6643,6 @@ static v8::Handle<v8::Value> JS_RotateVocbaseCol (v8::Arguments const& argv) {
   }
 
   TRI_primary_collection_t* primary = collection->_collection;
-  TRI_collection_t* base = &primary->base;
-
-  if (! TRI_IS_DOCUMENT_COLLECTION(base->_info._type)) {
-    ReleaseCollection(collection);
-    TRI_V8_EXCEPTION_INTERNAL(scope, "unknown collection type");
-  }
 
   TRI_document_collection_t* document = (TRI_document_collection_t*) primary;
@@ -7324,7 +7286,7 @@ static v8::Handle<v8::Value> MapGetVocBase (v8::Local<v8::String> name,
     return scope.Close(v8::Handle<v8::Value>());
   }
 
-  if (*key == '_' || // hide system collections
+  if (*key == '_' ||
       strcmp(key, "hasOwnProperty") == 0 ||  // this prevents calling the property getter again (i.e. recursion!)
       strcmp(key, "toString") == 0 ||
       strcmp(key, "toJSON") == 0) {
@@ -7338,7 +7300,6 @@ static v8::Handle<v8::Value> MapGetVocBase (v8::Local<v8::String> name,
   cacheKey.push_back('*');
 
   v8::Local<v8::String> cacheName = v8::String::New(cacheKey.c_str(), cacheKey.size());
-  v8::Handle<v8::Object> holder = info.Holder()->ToObject();
 
   if (holder->HasRealNamedProperty(cacheName)) {
@@ -7397,10 +7358,13 @@ static v8::Handle<v8::Value> MapGetVocBase (v8::Local<v8::String> name,
 #endif
 
   if (collection == 0) {
-    return scope.Close(v8::Undefined());
-  }
-
-  if (! TRI_IS_DOCUMENT_COLLECTION(collection->_type)) {
+    if (*key == '_') {
+      // we need to do this here...
+      // otherwise we'd hide all non-collection attributes such as
+      // db._drop
+      return scope.Close(v8::Handle<v8::Value>());
+    }
+
     return scope.Close(v8::Undefined());
   }
@@ -8825,14 +8789,15 @@ static v8::Handle<v8::Value> MapGetNamedShapedJson (v8::Local<v8::String> name,
   v8::Handle<v8::Object> self = info.Holder();
 
   if (self->InternalFieldCount() <= SLOT_BARRIER) {
-    TRI_V8_EXCEPTION_INTERNAL(scope, "corrupted shaped json");
+    // we better not throw here... otherwise this will cause a segfault
+    return scope.Close(v8::Handle<v8::Value>());
   }
 
   // get shaped json
   void* marker = TRI_UnwrapClass<void*>(self, WRP_SHAPED_JSON_TYPE);
 
   if (marker == 0) {
-    TRI_V8_EXCEPTION_INTERNAL(scope, "corrupted shaped json");
+    return scope.Close(v8::Handle<v8::Value>());
   }
 
   // convert the JavaScript string to a string

View File

@@ -357,11 +357,6 @@ bool TRI_LoadAuthInfo (TRI_vocbase_t* vocbase) {
     LOG_FATAL_AND_EXIT("collection '_users' cannot be loaded");
   }
 
-  if (! TRI_IS_DOCUMENT_COLLECTION(primary->base._info._type)) {
-    TRI_ReleaseCollectionVocBase(vocbase, collection);
-    LOG_FATAL_AND_EXIT("collection '_users' has an unknown collection type");
-  }
-
   TRI_WriteLockReadWriteLock(&vocbase->_authInfoLock);
 
   // .............................................................................

View File

@@ -249,7 +249,6 @@ void TRI_CleanupVocBase (void* data) {
     // check if we can get the compactor lock exclusively
     if (TRI_CheckAndLockCompactorVocBase(vocbase)) {
       size_t i, n;
-      TRI_col_type_e type;
 
       // copy all collections
       TRI_READ_LOCK_COLLECTIONS_VOCBASE(vocbase);
@@ -261,6 +260,7 @@ void TRI_CleanupVocBase (void* data) {
       for (i = 0; i < n; ++i) {
         TRI_vocbase_col_t* collection;
         TRI_primary_collection_t* primary;
+        TRI_document_collection_t* document;
 
         collection = (TRI_vocbase_col_t*) collections._buffer[i];
@@ -273,16 +273,13 @@ void TRI_CleanupVocBase (void* data) {
           continue;
         }
 
-        type = primary->base._info._type;
-
         TRI_READ_UNLOCK_STATUS_VOCBASE_COL(collection);
 
         // we're the only ones that can unload the collection, so using
         // the collection pointer outside the lock is ok
 
         // maybe cleanup indexes, unload the collection or some datafiles
-        if (TRI_IS_DOCUMENT_COLLECTION(type)) {
-          TRI_document_collection_t* document = (TRI_document_collection_t*) primary;
+        document = (TRI_document_collection_t*) primary;
 
         // clean indexes?
         if (iterations % (uint64_t) CLEANUP_INDEX_ITERATIONS == 0) {
@@ -291,7 +288,6 @@ void TRI_CleanupVocBase (void* data) {
           CleanupDocumentCollection(document);
         }
-        }
 
       TRI_UnlockCompactorVocBase(vocbase);
     }

View File

@@ -1077,15 +1077,12 @@ char* TRI_GetDirectoryCollection (char const* path,
                                   TRI_col_type_e type,
                                   TRI_voc_cid_t cid) {
   char* filename;
-
-  assert(path);
-  assert(name);
-
-  // other collections use the collection identifier
-  if (TRI_IS_DOCUMENT_COLLECTION(type)) {
   char* tmp1;
   char* tmp2;
 
+  assert(path != NULL);
+  assert(name != NULL);
+
   tmp1 = TRI_StringUInt64(cid);
 
   if (tmp1 == NULL) {
@@ -1107,12 +1104,6 @@ char* TRI_GetDirectoryCollection (char const* path,
   filename = TRI_Concatenate2File(path, tmp2);
 
   TRI_FreeString(TRI_CORE_MEM_ZONE, tmp1);
   TRI_FreeString(TRI_CORE_MEM_ZONE, tmp2);
-  }
-
-  // oops, unknown collection type
-  else {
-    TRI_set_errno(TRI_ERROR_ARANGO_UNKNOWN_COLLECTION_TYPE);
-    return NULL;
-  }
 
   if (filename == NULL) {
     TRI_set_errno(TRI_ERROR_OUT_OF_MEMORY);
@@ -1610,9 +1601,7 @@ int TRI_UpdateCollectionInfo (TRI_vocbase_t* vocbase,
                               TRI_collection_t* collection,
                               TRI_col_info_t const* parameter) {
-  if (TRI_IS_DOCUMENT_COLLECTION(collection->_info._type)) {
   TRI_LOCK_JOURNAL_ENTRIES_DOC_COLLECTION((TRI_document_collection_t*) collection);
-  }
 
   if (parameter != NULL) {
     collection->_info._doCompact = parameter->_doCompact;

@@ -1629,9 +1618,7 @@ int TRI_UpdateCollectionInfo (TRI_vocbase_t* vocbase,
     // ... probably a few others missing here ...
   }
 
-  if (TRI_IS_DOCUMENT_COLLECTION(collection->_info._type)) {
   TRI_UNLOCK_JOURNAL_ENTRIES_DOC_COLLECTION((TRI_document_collection_t*) collection);
-  }
 
   return TRI_SaveCollectionInfo(collection->_directory, &collection->_info, vocbase->_settings.forceSyncProperties);
 }

View File

@@ -142,26 +142,6 @@ struct TRI_vocbase_col_s;
 /// @}
 ////////////////////////////////////////////////////////////////////////////////
 
-// -----------------------------------------------------------------------------
-// --SECTION--                                                     public macros
-// -----------------------------------------------------------------------------
-
-////////////////////////////////////////////////////////////////////////////////
-/// @addtogroup VocBase
-/// @{
-////////////////////////////////////////////////////////////////////////////////
-
-////////////////////////////////////////////////////////////////////////////////
-/// @brief return whether the collection is a document collection
-////////////////////////////////////////////////////////////////////////////////
-
-#define TRI_IS_DOCUMENT_COLLECTION(type) \
-  ((type) == TRI_COL_TYPE_DOCUMENT || (type) == TRI_COL_TYPE_EDGE)
-
-////////////////////////////////////////////////////////////////////////////////
-/// @}
-////////////////////////////////////////////////////////////////////////////////
-
 // -----------------------------------------------------------------------------
 // --SECTION--                                                      public types
 // -----------------------------------------------------------------------------

View File

@@ -1471,7 +1471,6 @@ void TRI_CompactorVocBase (void* data) {
for (i = 0; i < n; ++i) {
TRI_vocbase_col_t* collection;
TRI_primary_collection_t* primary;
TRI_col_type_e type;
bool doCompact;
bool worked;
@@ -1492,10 +1491,8 @@ void TRI_CompactorVocBase (void* data) {
worked = false;
doCompact = primary->base._info._doCompact;
type = primary->base._info._type;
// for document collection, compactify datafiles
if (TRI_IS_DOCUMENT_COLLECTION(type)) {
if (collection->_status == TRI_VOC_COL_STATUS_LOADED && doCompact) {
TRI_barrier_t* ce;
@@ -1521,7 +1518,6 @@ void TRI_CompactorVocBase (void* data) {
// read-unlock the compaction lock
TRI_WriteUnlockReadWriteLock(&primary->_compactionLock);
}
}
TRI_READ_UNLOCK_STATUS_VOCBASE_COL(collection);

View File

@@ -344,11 +344,6 @@ TRI_index_t* TRI_LookupIndex (TRI_primary_collection_t* primary,
TRI_index_t* idx;
size_t i;
if (! TRI_IS_DOCUMENT_COLLECTION(primary->base._info._type)) {
TRI_set_errno(TRI_ERROR_ARANGO_UNKNOWN_COLLECTION_TYPE);
return NULL;
}
doc = (TRI_document_collection_t*) primary;
for (i = 0; i < doc->_allIndexes._length; ++i) {

View File

@@ -227,7 +227,6 @@ static bool CheckJournalDocumentCollection (TRI_document_collection_t* document)
////////////////////////////////////////////////////////////////////////////////
void TRI_SynchroniserVocBase (void* data) {
TRI_col_type_e type;
TRI_vocbase_t* vocbase = data;
TRI_vector_pointer_t collections;
@@ -256,6 +255,7 @@ void TRI_SynchroniserVocBase (void* data) {
for (i = 0; i < n; ++i) {
TRI_vocbase_col_t* collection;
TRI_primary_collection_t* primary;
bool result;
collection = collections._buffer[i];
@@ -274,17 +274,11 @@ void TRI_SynchroniserVocBase (void* data) {
primary = collection->_collection;
// for document collection, first sync and then seal
type = primary->base._info._type;
if (TRI_IS_DOCUMENT_COLLECTION(type)) {
bool result;
result = CheckSyncDocumentCollection((TRI_document_collection_t*) primary);
worked |= result;
result = CheckJournalDocumentCollection((TRI_document_collection_t*) primary);
worked |= result;
}
TRI_READ_UNLOCK_STATUS_VOCBASE_COL(collection);
}

View File

@@ -257,17 +257,6 @@ static bool UnloadCollectionCallback (TRI_collection_t* col, void* data) {
return true;
}
if (! TRI_IS_DOCUMENT_COLLECTION(collection->_type)) {
LOG_ERROR("cannot unload collection '%s' of type '%d'",
collection->_name,
(int) collection->_type);
collection->_status = TRI_VOC_COL_STATUS_LOADED;
TRI_WRITE_UNLOCK_STATUS_VOCBASE_COL(collection);
return false;
}
if (TRI_ContainsBarrierList(&collection->_collection->_barrierList, TRI_BARRIER_ELEMENT) ||
TRI_ContainsBarrierList(&collection->_collection->_barrierList, TRI_BARRIER_COLLECTION_REPLICATION) ||
TRI_ContainsBarrierList(&collection->_collection->_barrierList, TRI_BARRIER_COLLECTION_COMPACTION)) {
@@ -348,17 +337,6 @@ static bool DropCollectionCallback (TRI_collection_t* col,
// .............................................................................
if (collection->_collection != NULL) {
if (! TRI_IS_DOCUMENT_COLLECTION(collection->_type)) {
LOG_ERROR("cannot drop collection '%s' of type %d",
collection->_name,
(int) collection->_type);
TRI_WRITE_UNLOCK_STATUS_VOCBASE_COL(collection);
regfree(&re);
return false;
}
document = (TRI_document_collection_t*) collection->_collection;
res = TRI_CloseDocumentCollection(document);
@@ -975,8 +953,6 @@ static int ScanPath (TRI_vocbase_t* vocbase,
else {
// we found a collection that is still active
TRI_col_type_e type = info._type;
if (TRI_IS_DOCUMENT_COLLECTION(type)) {
TRI_vocbase_col_t* c;
if (info._version < TRI_COL_VERSION) {
@@ -1042,10 +1018,6 @@ static int ScanPath (TRI_vocbase_t* vocbase,
LOG_DEBUG("added document collection from '%s'", file);
}
else {
LOG_DEBUG("skipping collection of unknown type %d", (int) type);
}
}
TRI_FreeCollectionInfoOptions(&info);
}
else {
@@ -1071,8 +1043,6 @@ static int ScanPath (TRI_vocbase_t* vocbase,
static int LoadCollectionVocBase (TRI_vocbase_t* vocbase,
TRI_vocbase_col_t* collection) {
TRI_col_type_e type;
// .............................................................................
// read lock
// .............................................................................
@@ -1165,9 +1135,6 @@ static int LoadCollectionVocBase (TRI_vocbase_t* vocbase,
// unloaded, load collection
if (collection->_status == TRI_VOC_COL_STATUS_UNLOADED) {
type = (TRI_col_type_e) collection->_type;
if (TRI_IS_DOCUMENT_COLLECTION(type)) {
TRI_document_collection_t* document;
// set the status to loading
@@ -1203,13 +1170,6 @@ static int LoadCollectionVocBase (TRI_vocbase_t* vocbase,
return LoadCollectionVocBase(vocbase, collection);
}
else {
LOG_ERROR("unknown collection type %d for '%s'", (int) type, collection->_name);
TRI_WRITE_UNLOCK_STATUS_VOCBASE_COL(collection);
return TRI_set_errno(TRI_ERROR_ARANGO_UNKNOWN_COLLECTION_TYPE);
}
}
LOG_ERROR("unknown collection status %d for '%s'", (int) collection->_status, collection->_name);
@@ -1963,10 +1923,9 @@ TRI_vocbase_col_t* TRI_CreateCollectionVocBase (TRI_vocbase_t* vocbase,
TRI_voc_cid_t cid,
TRI_server_id_t generatingServer) {
TRI_vocbase_col_t* collection;
TRI_col_type_e type;
char* name;
assert(parameter);
assert(parameter != NULL);
name = parameter->_name;
// check that the name does not contain any strange characters
@@ -1976,15 +1935,6 @@ TRI_vocbase_col_t* TRI_CreateCollectionVocBase (TRI_vocbase_t* vocbase,
return NULL;
}
type = (TRI_col_type_e) parameter->_type;
if (! TRI_IS_DOCUMENT_COLLECTION(type)) {
LOG_ERROR("unknown collection type: %d", (int) parameter->_type);
TRI_set_errno(TRI_ERROR_ARANGO_UNKNOWN_COLLECTION_TYPE);
return NULL;
}
TRI_ReadLockReadWriteLock(&vocbase->_inventoryLock);
collection = CreateCollection(vocbase, parameter, cid, generatingServer);

View File

@@ -303,7 +303,8 @@ int main (int argc, char* argv[]) {
BaseClient.sslProtocol(),
false);
if (! ClientConnection->isConnected() || ClientConnection->getLastHttpReturnCode() != HttpResponse::OK) {
if (! ClientConnection->isConnected() ||
ClientConnection->getLastHttpReturnCode() != HttpResponse::OK) {
cerr << "Could not connect to endpoint '" << BaseClient.endpointServer()->getSpecification()
<< "', database: '" << BaseClient.databaseName() << "'" << endl;
cerr << "Error message: '" << ClientConnection->getErrorMessage() << "'" << endl;
@@ -358,18 +359,18 @@ int main (int argc, char* argv[]) {
// collection name
if (CollectionName == "") {
cerr << "collection name is missing." << endl;
cerr << "Collection name is missing." << endl;
TRI_EXIT_FUNCTION(EXIT_FAILURE, NULL);
}
// filename
if (FileName == "") {
cerr << "file name is missing." << endl;
cerr << "File name is missing." << endl;
TRI_EXIT_FUNCTION(EXIT_FAILURE, NULL);
}
if (FileName != "-" && ! FileUtils::isRegularFile(FileName)) {
cerr << "file '" << FileName << "' is not a regular file." << endl;
cerr << "Cannot open file '" << FileName << "'" << endl;
TRI_EXIT_FUNCTION(EXIT_FAILURE, NULL);
}
@@ -415,9 +416,6 @@ int main (int argc, char* argv[]) {
cerr << "error message: " << ih.getErrorMessage() << endl;
}
// calling dispose in V8 3.10.x causes a segfault. the v8 docs says its not necessary to call it upon program termination
// v8::V8::Dispose();
TRIAGENS_REST_SHUTDOWN;
arangoimpExitFunction(ret, NULL);

View File

@@ -146,6 +146,24 @@ actions.defineHttp({
})
});
////////////////////////////////////////////////////////////////////////////////
/// @brief rescans the FOXX application directory
////////////////////////////////////////////////////////////////////////////////
actions.defineHttp({
url : "_admin/foxx/rescan",
context : "admin",
prefix : false,
callback: easyPostCallback({
body: true,
callback: function (body) {
foxxManager.scanAppDirectory();
return true;
}
})
});
////////////////////////////////////////////////////////////////////////////////
/// @brief sets up a FOXX application
////////////////////////////////////////////////////////////////////////////////

View File

@@ -183,6 +183,7 @@
"ERROR_GRAPH_COULD_NOT_CREATE_EDGE" : { "code" : 1907, "message" : "could not create edge" },
"ERROR_GRAPH_COULD_NOT_CHANGE_EDGE" : { "code" : 1908, "message" : "could not change edge" },
"ERROR_GRAPH_TOO_MANY_ITERATIONS" : { "code" : 1909, "message" : "too many iterations" },
"ERROR_GRAPH_INVALID_FILTER_RESULT" : { "code" : 1910, "message" : "invalid filter result" },
"ERROR_SESSION_UNKNOWN" : { "code" : 1950, "message" : "unknown session" },
"ERROR_SESSION_EXPIRED" : { "code" : 1951, "message" : "session expired" },
"SIMPLE_CLIENT_UNKNOWN_ERROR" : { "code" : 2000, "message" : "unknown client error" },

View File

@@ -31,6 +31,7 @@ module.define("org/arangodb/graph/traversal", function(exports, module) {
var graph = require("org/arangodb/graph");
var arangodb = require("org/arangodb");
var BinaryHeap = require("org/arangodb/heap").BinaryHeap;
var ArangoError = arangodb.ArangoError;
var db = arangodb.db;
@@ -38,9 +39,54 @@ var db = arangodb.db;
var ArangoTraverser;
// -----------------------------------------------------------------------------
// --SECTION-- public functions
// --SECTION-- helper functions
// -----------------------------------------------------------------------------
////////////////////////////////////////////////////////////////////////////////
/// @brief clone any object
////////////////////////////////////////////////////////////////////////////////
function clone (obj) {
if (obj === null || typeof(obj) !== "object") {
return obj;
}
var copy, i;
if (Array.isArray(obj)) {
copy = [ ];
for (i = 0; i < obj.length; ++i) {
copy[i] = clone(obj[i]);
}
}
else if (obj instanceof Object) {
copy = { };
if (obj.hasOwnProperty) {
for (i in obj) {
if (obj.hasOwnProperty(i)) {
copy[i] = clone(obj[i]);
}
}
}
}
return copy;
}
////////////////////////////////////////////////////////////////////////////////
/// @brief traversal abortion exception
////////////////////////////////////////////////////////////////////////////////
var abortedException = function (message, options) {
'use strict';
this.message = message || "traversal intentionally aborted by user";
this.options = options || { };
this._intentionallyAborted = true;
};
abortedException.prototype = new Error();
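The `clone` helper moved to the top of the module is plain JavaScript, so its deep-copy behavior can be checked outside ArangoDB. The snippet below is a standalone copy for illustration only (the sample data is made up):

```javascript
// Standalone copy of the clone() helper introduced above, exercised on a
// small nested structure to show that nested objects and arrays are
// copied recursively rather than shared with the original.
function clone (obj) {
  if (obj === null || typeof(obj) !== "object") {
    return obj;
  }

  var copy, i;

  if (Array.isArray(obj)) {
    copy = [ ];
    for (i = 0; i < obj.length; ++i) {
      copy[i] = clone(obj[i]);
    }
  }
  else if (obj instanceof Object) {
    copy = { };
    if (obj.hasOwnProperty) {
      for (i in obj) {
        if (obj.hasOwnProperty(i)) {
          copy[i] = clone(obj[i]);
        }
      }
    }
  }

  return copy;
}

var original = { name: "v1", tags: [ "a", "b" ], nested: { depth: 2 } };
var copied = clone(original);

copied.nested.depth = 99;  // mutate the copy...
copied.tags.push("c");

console.log(original.nested.depth); // ...the original still reads 2
console.log(original.tags.length);  // and still has 2 tags
```

This matters for `trackingVisitor` above: vertices pushed into `result.visited.vertices` are snapshots, so later mutations during the traversal do not rewrite already-collected results.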
// -----------------------------------------------------------------------------
// --SECTION-- datasources
// -----------------------------------------------------------------------------
@@ -366,35 +412,6 @@ function trackingVisitor (config, result, vertex, path) {
return;
}
function clone (obj) {
if (obj === null || typeof(obj) !== "object") {
return obj;
}
var copy, i;
if (Array.isArray(obj)) {
copy = [ ];
for (i = 0; i < obj.length; ++i) {
copy[i] = clone(obj[i]);
}
}
else if (obj instanceof Object) {
copy = { };
if (obj.hasOwnProperty) {
for (i in obj) {
if (obj.hasOwnProperty(i)) {
copy[i] = clone(obj[i]);
}
}
}
}
return copy;
}
if (result.visited.vertices) {
result.visited.vertices.push(clone(vertex));
}
@@ -555,7 +572,10 @@ function parseFilterResult (args) {
return;
}
throw "invalid filter result";
var err = new ArangoError();
err.errorNum = arangodb.errors.ERROR_GRAPH_INVALID_FILTER_RESULT.code;
err.errorMessage = arangodb.errors.ERROR_GRAPH_INVALID_FILTER_RESULT.message;
throw err;
}
processArgument(args);
@@ -629,6 +649,10 @@ function checkReverse (config) {
function breadthFirstSearch () {
return {
requiresEndVertex: function () {
return false;
},
getPathItems: function (id, items) {
var visited = { };
var ignore = items.length - 1;
@@ -757,6 +781,10 @@ function breadthFirstSearch () {
function depthFirstSearch () {
return {
requiresEndVertex: function () {
return false;
},
getPathItems: function (id, items) {
var visited = { };
items.forEach(function (item) {
@@ -854,6 +882,240 @@ function depthFirstSearch () {
};
}
////////////////////////////////////////////////////////////////////////////////
/// @brief implementation details for dijkstra shortest path strategy
////////////////////////////////////////////////////////////////////////////////
function dijkstraSearch () {
return {
nodes: { },
requiresEndVertex: function () {
return true;
},
makeNode: function (vertex) {
var id = vertex._id;
if (! this.nodes.hasOwnProperty(id)) {
this.nodes[id] = { vertex: vertex, dist: Infinity };
}
return this.nodes[id];
},
vertexList: function (vertex) {
var result = [ ];
while (vertex) {
result.push(vertex);
vertex = vertex.parent;
}
return result;
},
buildPath: function (vertex) {
var path = { vertices: [ vertex.vertex ], edges: [ ] };
var v = vertex;
while (v.parent) {
path.vertices.unshift(v.parent.vertex);
path.edges.unshift(v.parentEdge);
v = v.parent;
}
return path;
},
run: function (config, result, startVertex, endVertex) {
var maxIterations = config.maxIterations, visitCounter = 0;
var heap = new BinaryHeap(function (node) {
return node.dist;
});
var startNode = this.makeNode(startVertex);
startNode.dist = 0;
heap.push(startNode);
while (heap.size() > 0) {
if (visitCounter++ > maxIterations) {
var err = new ArangoError();
err.errorNum = arangodb.errors.ERROR_GRAPH_TOO_MANY_ITERATIONS.code;
err.errorMessage = arangodb.errors.ERROR_GRAPH_TOO_MANY_ITERATIONS.message;
throw err;
}
var currentNode = heap.pop();
var i, n;
if (currentNode.vertex._id === endVertex._id) {
var vertices = this.vertexList(currentNode);
if (config.order !== ArangoTraverser.PRE_ORDER) {
vertices.reverse();
}
n = vertices.length;
for (i = 0; i < n; ++i) {
config.visitor(config, result, vertices[i].vertex, this.buildPath(vertices[i]));
}
return;
}
if (currentNode.visited) {
continue;
}
if (currentNode.dist === Infinity) {
break;
}
currentNode.visited = true;
var dist = currentNode.dist;
var path = this.buildPath(currentNode);
var connected = config.expander(config, currentNode.vertex, path);
n = connected.length;
for (i = 0; i < n; ++i) {
var neighbor = this.makeNode(connected[i].vertex);
if (neighbor.visited) {
continue;
}
var edge = connected[i].edge;
var weight = 1;
if (config.distance) {
weight = config.distance(config, currentNode.vertex, neighbor.vertex, edge);
}
var alt = dist + weight;
if (alt < neighbor.dist) {
neighbor.dist = alt;
neighbor.parent = currentNode;
neighbor.parentEdge = edge;
heap.push(neighbor);
}
}
}
}
};
}
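The `dijkstraSearch` strategy above depends on ArangoDB's expander, visitor, and `BinaryHeap` modules. Its core relaxation loop can be sketched in isolation with a plain adjacency map and a sorted array standing in for the heap; all names below are illustrative stand-ins, not part of the ArangoDB API:

```javascript
// Minimal Dijkstra sketch mirroring the logic of dijkstraSearch():
// lazily created node records, parent pointers for path reconstruction,
// and a visited flag to skip stale queue entries.
function shortestPath (edges, start, end) {
  var nodes = { };

  function makeNode (id) {
    if (! nodes.hasOwnProperty(id)) {
      nodes[id] = { id: id, dist: Infinity };
    }
    return nodes[id];
  }

  var queue = [ makeNode(start) ];
  nodes[start].dist = 0;

  while (queue.length > 0) {
    queue.sort(function (a, b) { return a.dist - b.dist; });
    var current = queue.shift();          // cheapest node first

    if (current.id === end) {
      var path = [ ];
      while (current) {                   // walk parent pointers back
        path.unshift(current.id);
        current = current.parent;
      }
      return path;
    }

    if (current.visited) {
      continue;                           // stale duplicate entry
    }
    current.visited = true;

    (edges[current.id] || [ ]).forEach(function (e) {
      var neighbor = makeNode(e.to);
      var alt = current.dist + e.weight;
      if (! neighbor.visited && alt < neighbor.dist) {
        neighbor.dist = alt;
        neighbor.parent = current;
        queue.push(neighbor);
      }
    });
  }

  return null;                            // end vertex unreachable
}

var g = {
  A: [ { to: "B", weight: 1 }, { to: "C", weight: 4 } ],
  B: [ { to: "C", weight: 1 }, { to: "D", weight: 5 } ],
  C: [ { to: "D", weight: 1 } ]
};

console.log(shortestPath(g, "A", "D")); // [ 'A', 'B', 'C', 'D' ]
```

As in `dijkstraSearch`, a node may be pushed more than once when a cheaper route is found; stale entries are skipped via the `visited` flag instead of being re-scored in place, which is why the strategy declares `requiresEndVertex` and stops as soon as the end vertex is popped.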
////////////////////////////////////////////////////////////////////////////////
/// @brief implementation details for a* shortest path strategy
////////////////////////////////////////////////////////////////////////////////
function astarSearch () {
return {
nodes: { },
requiresEndVertex: function () {
return true;
},
makeNode: function (vertex) {
var id = vertex._id;
if (! this.nodes.hasOwnProperty(id)) {
this.nodes[id] = { vertex: vertex, f: 0, g: 0, h: 0 };
}
return this.nodes[id];
},
vertexList: function (vertex) {
var result = [ ];
while (vertex) {
result.push(vertex);
vertex = vertex.parent;
}
return result;
},
buildPath: function (vertex) {
var path = { vertices: [ vertex.vertex ], edges: [ ] };
var v = vertex;
while (v.parent) {
path.vertices.unshift(v.parent.vertex);
path.edges.unshift(v.parentEdge);
v = v.parent;
}
return path;
},
run: function (config, result, startVertex, endVertex) {
var maxIterations = config.maxIterations, visitCounter = 0;
var heap = new BinaryHeap(function (node) {
return node.f;
});
heap.push(this.makeNode(startVertex));
while (heap.size() > 0) {
if (visitCounter++ > maxIterations) {
var err = new ArangoError();
err.errorNum = arangodb.errors.ERROR_GRAPH_TOO_MANY_ITERATIONS.code;
err.errorMessage = arangodb.errors.ERROR_GRAPH_TOO_MANY_ITERATIONS.message;
throw err;
}
var currentNode = heap.pop();
var i, n;
if (currentNode.vertex._id === endVertex._id) {
var vertices = this.vertexList(currentNode);
if (config.order !== ArangoTraverser.PRE_ORDER) {
vertices.reverse();
}
n = vertices.length;
for (i = 0; i < n; ++i) {
config.visitor(config, result, vertices[i].vertex, this.buildPath(vertices[i]));
}
return;
}
currentNode.closed = true;
var path = this.buildPath(currentNode);
var connected = config.expander(config, currentNode.vertex, path);
n = connected.length;
for (i = 0; i < n; ++i) {
var neighbor = this.makeNode(connected[i].vertex);
if (neighbor.closed) {
continue;
}
var gScore = currentNode.g + 1;// + neighbor.cost;
var beenVisited = neighbor.visited;
if (! beenVisited || gScore < neighbor.g) {
var edge = connected[i].edge;
neighbor.visited = true;
neighbor.parent = currentNode;
neighbor.parentEdge = edge;
neighbor.h = 1;
if (config.distance && ! neighbor.h) {
neighbor.h = config.distance(config, neighbor.vertex, endVertex, edge);
}
neighbor.g = gScore;
neighbor.f = neighbor.g + neighbor.h;
if (! beenVisited) {
heap.push(neighbor);
}
else {
heap.rescoreElement(neighbor);
}
}
}
}
}
};
}
////////////////////////////////////////////////////////////////////////////////
/// @}
@@ -959,7 +1221,9 @@ ArangoTraverser = function (config) {
config.strategy = validate(config.strategy, {
depthfirst: ArangoTraverser.DEPTH_FIRST,
breadthfirst: ArangoTraverser.BREADTH_FIRST
breadthfirst: ArangoTraverser.BREADTH_FIRST,
astar: ArangoTraverser.ASTAR_SEARCH,
dijkstra: ArangoTraverser.DIJKSTRA_SEARCH
}, "strategy");
config.order = validate(config.order, {
@@ -1054,23 +1318,54 @@ ArangoTraverser = function (config) {
/// @brief execute the traversal
////////////////////////////////////////////////////////////////////////////////
ArangoTraverser.prototype.traverse = function (result, startVertex) {
ArangoTraverser.prototype.traverse = function (result, startVertex, endVertex) {
// check the start vertex
if (startVertex === undefined || startVertex === null) {
throw "invalid startVertex specified for traversal";
}
// get the traversal strategy
var strategy;
if (this.config.strategy === ArangoTraverser.BREADTH_FIRST) {
if (this.config.strategy === ArangoTraverser.ASTAR_SEARCH) {
strategy = astarSearch();
}
else if (this.config.strategy === ArangoTraverser.DIJKSTRA_SEARCH) {
strategy = dijkstraSearch();
}
else if (this.config.strategy === ArangoTraverser.BREADTH_FIRST) {
strategy = breadthFirstSearch();
}
else {
strategy = depthFirstSearch();
}
// check the start vertex
if (startVertex === undefined ||
startVertex === null ||
typeof startVertex !== 'object') {
var err1 = new ArangoError();
err1.errorNum = arangodb.errors.ERROR_BAD_PARAMETER.code;
err1.errorMessage = arangodb.errors.ERROR_BAD_PARAMETER.message +
": invalid startVertex specified for traversal";
throw err1;
}
if (strategy.requiresEndVertex() &&
(endVertex === undefined ||
endVertex === null ||
typeof endVertex !== 'object')) {
var err2 = new ArangoError();
err2.errorNum = arangodb.errors.ERROR_BAD_PARAMETER.code;
err2.errorMessage = arangodb.errors.ERROR_BAD_PARAMETER.message +
": invalid endVertex specified for traversal";
throw err2;
}
// run the traversal
strategy.run(this.config, result, startVertex);
try {
strategy.run(this.config, result, startVertex, endVertex);
}
catch (err3) {
if (typeof err3 !== "object" || ! err3._intentionallyAborted) {
throw err3;
}
}
};
////////////////////////////////////////////////////////////////////////////////
@@ -1116,6 +1411,18 @@ ArangoTraverser.BREADTH_FIRST = 0;
ArangoTraverser.DEPTH_FIRST = 1;
////////////////////////////////////////////////////////////////////////////////
/// @brief astar search
////////////////////////////////////////////////////////////////////////////////
ArangoTraverser.ASTAR_SEARCH = 2;
////////////////////////////////////////////////////////////////////////////////
/// @brief dijkstra search
////////////////////////////////////////////////////////////////////////////////
ArangoTraverser.DIJKSTRA_SEARCH = 3;
////////////////////////////////////////////////////////////////////////////////
/// @brief pre-order traversal
////////////////////////////////////////////////////////////////////////////////
@@ -1181,6 +1488,7 @@ exports.visitAllFilter = visitAllFilter;
exports.maxDepthFilter = maxDepthFilter;
exports.minDepthFilter = minDepthFilter;
exports.includeMatchingAttributesFilter = includeMatchingAttributesFilter;
exports.abortedException = abortedException;
exports.Traverser = ArangoTraverser;
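Taken together, the exported `abortedException` marker and the new `try`/`catch` in `traverse` let a visitor stop a traversal early without surfacing an error to the caller. A minimal standalone sketch of that control flow, using plain-JS stand-ins rather than the real traverser:

```javascript
// Stand-in for the abortedException constructor added above: any thrown
// object carrying _intentionallyAborted is swallowed by the runner.
function abortedException (message) {
  this.message = message || "traversal intentionally aborted by user";
  this._intentionallyAborted = true;
}
abortedException.prototype = new Error();

// Stand-in for ArangoTraverser.prototype.traverse: run the strategy and
// silence only intentional aborts; real errors still propagate.
function runTraversal (vertices, visitor) {
  try {
    vertices.forEach(visitor);          // stand-in for strategy.run()
  }
  catch (err) {
    if (typeof err !== "object" || ! err._intentionallyAborted) {
      throw err;
    }
  }
}

var seen = [ ];
runTraversal([ "a", "b", "c", "d" ], function (v) {
  seen.push(v);
  if (seen.length === 2) {
    throw new abortedException();       // caught silently by runTraversal
  }
});

console.log(seen); // [ 'a', 'b' ]
```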

View File

@@ -683,6 +683,9 @@ exports.run = function (args) {
exports.mount(args[1], args[2]);
}
}
else if (type === 'rescan') {
exports.rescan();
}
else if (type === 'setup') {
exports.setup(args[1]);
}
@@ -821,6 +824,18 @@ exports.fetch = function (type, location, version) {
return arangosh.checkRequestResult(res);
};
////////////////////////////////////////////////////////////////////////////////
/// @brief rescans the FOXX application directory
////////////////////////////////////////////////////////////////////////////////
exports.rescan = function () {
'use strict';
var res = arango.POST("/_admin/foxx/rescan", "");
return arangosh.checkRequestResult(res);
};
////////////////////////////////////////////////////////////////////////////////
/// @brief mounts a FOXX application
////////////////////////////////////////////////////////////////////////////////
@@ -1435,6 +1450,8 @@ exports.help = function () {
"setup" : "setup executes the setup script (app must already be mounted)",
"install" : "fetches a foxx application from the central foxx-apps repository, mounts it to a local URL " +
"and sets it up",
"rescan" : "rescans the foxx application directory on the server side (only needed if server-side apps " +
"directory is modified by other processes)",
"replace" : "replaces an aleady existing foxx application with the current local version",
"teardown" : "teardown execute the teardown script (app must be still be mounted)",
"unmount" : "unmounts a mounted foxx application",

View File

@@ -183,6 +183,7 @@
"ERROR_GRAPH_COULD_NOT_CREATE_EDGE" : { "code" : 1907, "message" : "could not create edge" },
"ERROR_GRAPH_COULD_NOT_CHANGE_EDGE" : { "code" : 1908, "message" : "could not change edge" },
"ERROR_GRAPH_TOO_MANY_ITERATIONS" : { "code" : 1909, "message" : "too many iterations" },
"ERROR_GRAPH_INVALID_FILTER_RESULT" : { "code" : 1910, "message" : "invalid filter result" },
"ERROR_SESSION_UNKNOWN" : { "code" : 1950, "message" : "unknown session" },
"ERROR_SESSION_EXPIRED" : { "code" : 1951, "message" : "session expired" },
"SIMPLE_CLIENT_UNKNOWN_ERROR" : { "code" : 2000, "message" : "unknown client error" },

View File

@@ -735,6 +735,17 @@ function require (path) {
}
}
// actually the file name can be set via the path attribute
if (origin === undefined) {
origin = description.path;
}
// strip protocol (e.g. file://)
if (typeof origin === 'string') {
origin = origin.replace(/^[a-z]+:\/\//, '');
}
sandbox.__filename = origin;
sandbox.__dirname = typeof origin === 'string' ? origin.split('/').slice(0, -1).join('/') : origin;
sandbox.module = module;
sandbox.exports = module.exports;
sandbox.require = function(path) { return module.require(path); };
@@ -1326,6 +1337,8 @@ function require (path) {
}
}
sandbox.__filename = full;
sandbox.__dirname = full.split('/').slice(0, -1).join('/');
sandbox.module = appModule;
sandbox.applicationContext = appContext;
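Both `__dirname` computations in this diff derive the directory from a filename with `split('/').slice(0, -1).join('/')`, after stripping a protocol prefix such as `file://` from the module origin. These are plain string operations and can be checked in isolation (the sample path below is made up):

```javascript
// Stand-ins for the two transformations applied to the module origin
// before it is exposed as __filename / __dirname in the sandbox.
function stripProtocol (origin) {
  // strip protocol (e.g. file://), exactly as in the diff above
  return typeof origin === 'string'
    ? origin.replace(/^[a-z]+:\/\//, '')
    : origin;
}

function dirnameOf (filename) {
  // drop the last path component
  return filename.split('/').slice(0, -1).join('/');
}

var origin = "file:///usr/share/arangodb/js/common/modules/console.js";
var file = stripProtocol(origin);

console.log(file);            // /usr/share/arangodb/js/common/modules/console.js
console.log(dirnameOf(file)); // /usr/share/arangodb/js/common/modules
```

Note that a non-string origin passes through unchanged, matching the diff's `typeof origin === 'string'` guard, so `__dirname` can end up `undefined` for modules without a resolvable path.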

View File

@@ -30,6 +30,7 @@
var graph = require("org/arangodb/graph");
var arangodb = require("org/arangodb");
var BinaryHeap = require("org/arangodb/heap").BinaryHeap;
var ArangoError = arangodb.ArangoError;
var db = arangodb.db;
@@ -37,9 +38,54 @@ var db = arangodb.db;
var ArangoTraverser;
// -----------------------------------------------------------------------------
// --SECTION-- public functions
// --SECTION-- helper functions
// -----------------------------------------------------------------------------
////////////////////////////////////////////////////////////////////////////////
/// @brief clone any object
////////////////////////////////////////////////////////////////////////////////
function clone (obj) {
if (obj === null || typeof(obj) !== "object") {
return obj;
}
var copy, i;
if (Array.isArray(obj)) {
copy = [ ];
for (i = 0; i < obj.length; ++i) {
copy[i] = clone(obj[i]);
}
}
else if (obj instanceof Object) {
copy = { };
if (obj.hasOwnProperty) {
for (i in obj) {
if (obj.hasOwnProperty(i)) {
copy[i] = clone(obj[i]);
}
}
}
}
return copy;
}
////////////////////////////////////////////////////////////////////////////////
/// @brief traversal abortion exception
////////////////////////////////////////////////////////////////////////////////
var abortedException = function (message, options) {
'use strict';
this.message = message || "traversal intentionally aborted by user";
this.options = options || { };
this._intentionallyAborted = true;
};
abortedException.prototype = new Error();
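The `_intentionallyAborted` marker is what the traverser's `traverse` method later checks in order to swallow user-initiated aborts while re-throwing real errors. A minimal sketch of that contract (the `runTraversal` wrapper here is illustrative, not the actual traverser API):

```javascript
// Exception type as defined in the hunk above.
var abortedException = function (message, options) {
  'use strict';
  this.message = message || "traversal intentionally aborted by user";
  this.options = options || { };
  this._intentionallyAborted = true;
};
abortedException.prototype = new Error();

// Illustrative wrapper: swallow intentional aborts, re-throw everything else.
function runTraversal (visitor) {
  try {
    visitor();
  }
  catch (err) {
    if (typeof err !== "object" || ! err._intentionallyAborted) {
      throw err;
    }
  }
}

runTraversal(function () { throw new abortedException(); }); // returns quietly
```

A visitor or filter can thus terminate a traversal early without the caller seeing an error.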
// -----------------------------------------------------------------------------
// --SECTION-- datasources
// -----------------------------------------------------------------------------
@@ -365,35 +411,6 @@ function trackingVisitor (config, result, vertex, path) {
return;
}
function clone (obj) {
if (obj === null || typeof(obj) !== "object") {
return obj;
}
var copy, i;
if (Array.isArray(obj)) {
copy = [ ];
for (i = 0; i < obj.length; ++i) {
copy[i] = clone(obj[i]);
}
}
else if (obj instanceof Object) {
copy = { };
if (obj.hasOwnProperty) {
for (i in obj) {
if (obj.hasOwnProperty(i)) {
copy[i] = clone(obj[i]);
}
}
}
}
return copy;
}
if (result.visited.vertices) {
result.visited.vertices.push(clone(vertex));
}
@@ -554,7 +571,10 @@ function parseFilterResult (args) {
return;
}
-throw "invalid filter result";
+var err = new ArangoError();
err.errorNum = arangodb.errors.ERROR_GRAPH_INVALID_FILTER_RESULT.code;
err.errorMessage = arangodb.errors.ERROR_GRAPH_INVALID_FILTER_RESULT.message;
throw err;
}
processArgument(args);
@@ -628,6 +648,10 @@ function checkReverse (config) {
function breadthFirstSearch () {
return {
requiresEndVertex: function () {
return false;
},
getPathItems: function (id, items) {
var visited = { };
var ignore = items.length - 1;
@@ -756,6 +780,10 @@ function breadthFirstSearch () {
function depthFirstSearch () {
return {
requiresEndVertex: function () {
return false;
},
getPathItems: function (id, items) {
var visited = { };
items.forEach(function (item) {
@@ -853,6 +881,240 @@ function depthFirstSearch () {
};
}
////////////////////////////////////////////////////////////////////////////////
/// @brief implementation details for dijkstra shortest path strategy
////////////////////////////////////////////////////////////////////////////////
function dijkstraSearch () {
return {
nodes: { },
requiresEndVertex: function () {
return true;
},
makeNode: function (vertex) {
var id = vertex._id;
if (! this.nodes.hasOwnProperty(id)) {
this.nodes[id] = { vertex: vertex, dist: Infinity };
}
return this.nodes[id];
},
vertexList: function (vertex) {
var result = [ ];
while (vertex) {
result.push(vertex);
vertex = vertex.parent;
}
return result;
},
buildPath: function (vertex) {
var path = { vertices: [ vertex.vertex ], edges: [ ] };
var v = vertex;
while (v.parent) {
path.vertices.unshift(v.parent.vertex);
path.edges.unshift(v.parentEdge);
v = v.parent;
}
return path;
},
run: function (config, result, startVertex, endVertex) {
var maxIterations = config.maxIterations, visitCounter = 0;
var heap = new BinaryHeap(function (node) {
return node.dist;
});
var startNode = this.makeNode(startVertex);
startNode.dist = 0;
heap.push(startNode);
while (heap.size() > 0) {
if (visitCounter++ > maxIterations) {
var err = new ArangoError();
err.errorNum = arangodb.errors.ERROR_GRAPH_TOO_MANY_ITERATIONS.code;
err.errorMessage = arangodb.errors.ERROR_GRAPH_TOO_MANY_ITERATIONS.message;
throw err;
}
var currentNode = heap.pop();
var i, n;
if (currentNode.vertex._id === endVertex._id) {
var vertices = this.vertexList(currentNode);
if (config.order !== ArangoTraverser.PRE_ORDER) {
vertices.reverse();
}
n = vertices.length;
for (i = 0; i < n; ++i) {
config.visitor(config, result, vertices[i].vertex, this.buildPath(vertices[i]));
}
return;
}
if (currentNode.visited) {
continue;
}
if (currentNode.dist === Infinity) {
break;
}
currentNode.visited = true;
var dist = currentNode.dist;
var path = this.buildPath(currentNode);
var connected = config.expander(config, currentNode.vertex, path);
n = connected.length;
for (i = 0; i < n; ++i) {
var neighbor = this.makeNode(connected[i].vertex);
if (neighbor.visited) {
continue;
}
var edge = connected[i].edge;
var weight = 1;
if (config.distance) {
weight = config.distance(config, currentNode.vertex, neighbor.vertex, edge);
}
var alt = dist + weight;
if (alt < neighbor.dist) {
neighbor.dist = alt;
neighbor.parent = currentNode;
neighbor.parentEdge = edge;
heap.push(neighbor);
}
}
}
}
};
}
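The heap-driven loop above follows the textbook algorithm: repeatedly extract the unvisited node with the smallest tentative distance and relax its outgoing edges. A self-contained toy version over a plain adjacency list (linear-scan extraction stands in for the BinaryHeap; graph data is made up) illustrates the relaxation step:

```javascript
// Toy Dijkstra over an adjacency list. `edge.weight` plays the role of
// config.distance in the strategy above.
function shortestDist (graph, start, end) {
  var dist = { }, visited = { }, v;
  Object.keys(graph).forEach(function (k) { dist[k] = Infinity; });
  dist[start] = 0;
  while (true) {
    // pick the cheapest unvisited node (the heap does this in O(log n))
    var current = null;
    for (v in dist) {
      if (! visited[v] && (current === null || dist[v] < dist[current])) {
        current = v;
      }
    }
    if (current === null || dist[current] === Infinity) {
      return Infinity;                 // end vertex unreachable
    }
    if (current === end) {
      return dist[current];
    }
    visited[current] = true;
    graph[current].forEach(function (edge) {
      var alt = dist[current] + edge.weight;   // relaxation, as `alt` above
      if (alt < dist[edge.to]) {
        dist[edge.to] = alt;
      }
    });
  }
}

var g = {
  a: [ { to: "b", weight: 1 }, { to: "c", weight: 4 } ],
  b: [ { to: "c", weight: 2 } ],
  c: [ ]
};
// shortestDist(g, "a", "c") === 3 (a -> b -> c)
```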
////////////////////////////////////////////////////////////////////////////////
/// @brief implementation details for a* shortest path strategy
////////////////////////////////////////////////////////////////////////////////
function astarSearch () {
return {
nodes: { },
requiresEndVertex: function () {
return true;
},
makeNode: function (vertex) {
var id = vertex._id;
if (! this.nodes.hasOwnProperty(id)) {
this.nodes[id] = { vertex: vertex, f: 0, g: 0, h: 0 };
}
return this.nodes[id];
},
vertexList: function (vertex) {
var result = [ ];
while (vertex) {
result.push(vertex);
vertex = vertex.parent;
}
return result;
},
buildPath: function (vertex) {
var path = { vertices: [ vertex.vertex ], edges: [ ] };
var v = vertex;
while (v.parent) {
path.vertices.unshift(v.parent.vertex);
path.edges.unshift(v.parentEdge);
v = v.parent;
}
return path;
},
run: function (config, result, startVertex, endVertex) {
var maxIterations = config.maxIterations, visitCounter = 0;
var heap = new BinaryHeap(function (node) {
return node.f;
});
heap.push(this.makeNode(startVertex));
while (heap.size() > 0) {
if (visitCounter++ > maxIterations) {
var err = new ArangoError();
err.errorNum = arangodb.errors.ERROR_GRAPH_TOO_MANY_ITERATIONS.code;
err.errorMessage = arangodb.errors.ERROR_GRAPH_TOO_MANY_ITERATIONS.message;
throw err;
}
var currentNode = heap.pop();
var i, n;
if (currentNode.vertex._id === endVertex._id) {
var vertices = this.vertexList(currentNode);
if (config.order !== ArangoTraverser.PRE_ORDER) {
vertices.reverse();
}
n = vertices.length;
for (i = 0; i < n; ++i) {
config.visitor(config, result, vertices[i].vertex, this.buildPath(vertices[i]));
}
return;
}
currentNode.closed = true;
var path = this.buildPath(currentNode);
var connected = config.expander(config, currentNode.vertex, path);
n = connected.length;
for (i = 0; i < n; ++i) {
var neighbor = this.makeNode(connected[i].vertex);
if (neighbor.closed) {
continue;
}
var gScore = currentNode.g + 1;// + neighbor.cost;
var beenVisited = neighbor.visited;
if (! beenVisited || gScore < neighbor.g) {
var edge = connected[i].edge;
neighbor.visited = true;
neighbor.parent = currentNode;
neighbor.parentEdge = edge;
neighbor.h = 1;
if (config.distance && ! neighbor.h) {
neighbor.h = config.distance(config, neighbor.vertex, endVertex, edge);
}
neighbor.g = gScore;
neighbor.f = neighbor.g + neighbor.h;
if (! beenVisited) {
heap.push(neighbor);
}
else {
heap.rescoreElement(neighbor);
}
}
}
}
}
};
}
////////////////////////////////////////////////////////////////////////////////
/// @}
@@ -958,7 +1220,9 @@ ArangoTraverser = function (config) {
config.strategy = validate(config.strategy, {
depthfirst: ArangoTraverser.DEPTH_FIRST,
-breadthfirst: ArangoTraverser.BREADTH_FIRST
+breadthfirst: ArangoTraverser.BREADTH_FIRST,
astar: ArangoTraverser.ASTAR_SEARCH,
dijkstra: ArangoTraverser.DIJKSTRA_SEARCH
}, "strategy");
config.order = validate(config.order, {
@@ -1053,23 +1317,54 @@ ArangoTraverser = function (config) {
/// @brief execute the traversal
////////////////////////////////////////////////////////////////////////////////
-ArangoTraverser.prototype.traverse = function (result, startVertex) {
+ArangoTraverser.prototype.traverse = function (result, startVertex, endVertex) {
// check the start vertex
if (startVertex === undefined || startVertex === null) {
throw "invalid startVertex specified for traversal";
}
// get the traversal strategy
var strategy;
if (this.config.strategy === ArangoTraverser.BREADTH_FIRST) {
if (this.config.strategy === ArangoTraverser.ASTAR_SEARCH) {
strategy = astarSearch();
}
else if (this.config.strategy === ArangoTraverser.DIJKSTRA_SEARCH) {
strategy = dijkstraSearch();
}
else if (this.config.strategy === ArangoTraverser.BREADTH_FIRST) {
strategy = breadthFirstSearch();
}
else {
strategy = depthFirstSearch();
}
// check the start vertex
if (startVertex === undefined ||
startVertex === null ||
typeof startVertex !== 'object') {
var err1 = new ArangoError();
err1.errorNum = arangodb.errors.ERROR_BAD_PARAMETER.code;
err1.errorMessage = arangodb.errors.ERROR_BAD_PARAMETER.message +
": invalid startVertex specified for traversal";
throw err1;
}
if (strategy.requiresEndVertex() &&
(endVertex === undefined ||
endVertex === null ||
typeof endVertex !== 'object')) {
var err2 = new ArangoError();
err2.errorNum = arangodb.errors.ERROR_BAD_PARAMETER.code;
err2.errorMessage = arangodb.errors.ERROR_BAD_PARAMETER.message +
": invalid endVertex specified for traversal";
throw err2;
}
// run the traversal
-strategy.run(this.config, result, startVertex);
+try {
strategy.run(this.config, result, startVertex, endVertex);
}
catch (err3) {
if (typeof err3 !== "object" || ! err3._intentionallyAborted) {
throw err3;
}
}
};
////////////////////////////////////////////////////////////////////////////////
@@ -1115,6 +1410,18 @@ ArangoTraverser.BREADTH_FIRST = 0;
ArangoTraverser.DEPTH_FIRST = 1;
////////////////////////////////////////////////////////////////////////////////
/// @brief astar search
////////////////////////////////////////////////////////////////////////////////
ArangoTraverser.ASTAR_SEARCH = 2;
////////////////////////////////////////////////////////////////////////////////
/// @brief dijkstra search
////////////////////////////////////////////////////////////////////////////////
ArangoTraverser.DIJKSTRA_SEARCH = 3;
////////////////////////////////////////////////////////////////////////////////
/// @brief pre-order traversal
////////////////////////////////////////////////////////////////////////////////
@@ -1180,6 +1487,7 @@ exports.visitAllFilter = visitAllFilter;
exports.maxDepthFilter = maxDepthFilter;
exports.minDepthFilter = minDepthFilter;
exports.includeMatchingAttributesFilter = includeMatchingAttributesFilter;
exports.abortedException = abortedException;
exports.Traverser = ArangoTraverser;


@@ -0,0 +1,189 @@
/*jslint indent: 2, nomen: true, maxlen: 100, sloppy: true, vars: true, white: true, plusplus: true, continue: true */
/*global exports */
////////////////////////////////////////////////////////////////////////////////
/// @brief binary min heap
///
/// @file
///
/// DISCLAIMER
///
/// Copyright 2011-2013 triagens GmbH, Cologne, Germany
///
/// Licensed under the Apache License, Version 2.0 (the "License");
/// you may not use this file except in compliance with the License.
/// You may obtain a copy of the License at
///
/// http://www.apache.org/licenses/LICENSE-2.0
///
/// Unless required by applicable law or agreed to in writing, software
/// distributed under the License is distributed on an "AS IS" BASIS,
/// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
/// See the License for the specific language governing permissions and
/// limitations under the License.
///
/// Copyright holder is triAGENS GmbH, Cologne, Germany
///
/// @author Jan Steemann
/// @author Copyright 2011-2013, triAGENS GmbH, Cologne, Germany
////////////////////////////////////////////////////////////////////////////////
////////////////////////////////////////////////////////////////////////////////
/// This file contains significant portions from the min heap published here:
/// http://github.com/bgrins/javascript-astar
/// Copyright (c) 2010, Brian Grinstead, http://briangrinstead.com
/// Freely distributable under the MIT License.
/// Includes Binary Heap (with modifications) from Marijn Haverbeke.
/// http://eloquentjavascript.net/appendix2.html
////////////////////////////////////////////////////////////////////////////////
////////////////////////////////////////////////////////////////////////////////
/// @brief constructor
////////////////////////////////////////////////////////////////////////////////
function BinaryHeap (scoreFunction) {
this.values = [ ];
this.scoreFunction = scoreFunction;
}
BinaryHeap.prototype = {
////////////////////////////////////////////////////////////////////////////////
/// @brief push an element into the heap
////////////////////////////////////////////////////////////////////////////////
push: function (element) {
this.values.push(element);
this._sinkDown(this.values.length - 1);
},
////////////////////////////////////////////////////////////////////////////////
/// @brief pop the min element from the heap
////////////////////////////////////////////////////////////////////////////////
pop: function () {
var result = this.values[0];
var end = this.values.pop();
if (this.values.length > 0) {
this.values[0] = end;
this._bubbleUp(0);
}
return result;
},
////////////////////////////////////////////////////////////////////////////////
/// @brief remove a specific element from the heap
////////////////////////////////////////////////////////////////////////////////
remove: function (node) {
var i = this.values.indexOf(node);
var end = this.values.pop();
if (i !== this.values.length - 1) {
this.values[i] = end;
if (this.scoreFunction(end) < this.scoreFunction(node)) {
this._sinkDown(i);
}
else {
this._bubbleUp(i);
}
}
},
////////////////////////////////////////////////////////////////////////////////
/// @brief return number of elements in heap
////////////////////////////////////////////////////////////////////////////////
size: function() {
return this.values.length;
},
////////////////////////////////////////////////////////////////////////////////
/// @brief reposition an element in the heap
////////////////////////////////////////////////////////////////////////////////
rescoreElement: function (node) {
this._sinkDown(this.values.indexOf(node));
},
////////////////////////////////////////////////////////////////////////////////
/// @brief move an element down the heap
////////////////////////////////////////////////////////////////////////////////
_sinkDown: function (n) {
var element = this.values[n];
while (n > 0) {
var parentN = Math.floor((n + 1) / 2) - 1,
parent = this.values[parentN];
if (this.scoreFunction(element) < this.scoreFunction(parent)) {
this.values[parentN] = element;
this.values[n] = parent;
n = parentN;
}
else {
break;
}
}
},
////////////////////////////////////////////////////////////////////////////////
/// @brief bubble up an element
////////////////////////////////////////////////////////////////////////////////
_bubbleUp: function (n) {
var length = this.values.length,
element = this.values[n],
elemScore = this.scoreFunction(element);
while (true) {
var child2n = (n + 1) * 2;
var child1n = child2n - 1;
var swap = null;
var child1Score;
if (child1n < length) {
var child1 = this.values[child1n];
child1Score = this.scoreFunction(child1);
if (child1Score < elemScore) {
swap = child1n;
}
}
if (child2n < length) {
var child2 = this.values[child2n];
var child2Score = this.scoreFunction(child2);
if (child2Score < (swap === null ? elemScore : child1Score)) {
swap = child2n;
}
}
if (swap !== null) {
this.values[n] = this.values[swap];
this.values[swap] = element;
n = swap;
}
else {
break;
}
}
}
};
// -----------------------------------------------------------------------------
// --SECTION-- MODULE EXPORTS
// -----------------------------------------------------------------------------
exports.BinaryHeap = BinaryHeap;
// -----------------------------------------------------------------------------
// --SECTION-- END-OF-FILE
// -----------------------------------------------------------------------------
// Local Variables:
// mode: outline-minor
// outline-regexp: "^\\(/// @brief\\|/// @addtogroup\\|// --SECTION--\\|/// @page\\|/// @\\}\\)"
// End:
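The heap keeps the element with the smallest score at the root, so the Dijkstra and A* strategies can always pop the cheapest node next. A condensed demonstration (just `push`/`pop`/`size`, with the original `_sinkDown`/`_bubbleUp` sift logic inlined; the sample scores are made up):

```javascript
// Condensed re-statement of the BinaryHeap above for demonstration.
function BinaryHeap (scoreFunction) {
  this.values = [ ];
  this.scoreFunction = scoreFunction;
}

BinaryHeap.prototype.size = function () {
  return this.values.length;
};

BinaryHeap.prototype.push = function (element) {
  // append, then sift towards the root while it scores lower than its parent
  this.values.push(element);
  var n = this.values.length - 1;
  while (n > 0) {
    var parentN = Math.floor((n + 1) / 2) - 1;
    if (this.scoreFunction(this.values[n]) < this.scoreFunction(this.values[parentN])) {
      var tmp = this.values[parentN];
      this.values[parentN] = this.values[n];
      this.values[n] = tmp;
      n = parentN;
    }
    else {
      break;
    }
  }
};

BinaryHeap.prototype.pop = function () {
  // swap the last element into the root, then sift it down
  var result = this.values[0];
  var end = this.values.pop();
  if (this.values.length > 0) {
    this.values[0] = end;
    var n = 0, length = this.values.length, score = this.scoreFunction;
    while (true) {
      var child1n = 2 * n + 1, child2n = 2 * n + 2, swap = null;
      if (child1n < length && score(this.values[child1n]) < score(this.values[n])) {
        swap = child1n;
      }
      if (child2n < length &&
          score(this.values[child2n]) < score(this.values[swap === null ? n : swap])) {
        swap = child2n;
      }
      if (swap === null) { break; }
      var t = this.values[n];
      this.values[n] = this.values[swap];
      this.values[swap] = t;
      n = swap;
    }
  }
  return result;
};

// Nodes pop in ascending `dist` order, which the Dijkstra strategy relies on.
var heap = new BinaryHeap(function (node) { return node.dist; });
[ 5, 1, 4, 2 ].forEach(function (d) { heap.push({ dist: d }); });
// successive heap.pop().dist values: 1, 2, 4, 5
```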


@@ -3254,6 +3254,36 @@ function FIRST_DOCUMENT () {
return null;
}
////////////////////////////////////////////////////////////////////////////////
/// @brief return the parts of a document identifier separately
///
/// returns a document with the attributes `collection` and `key` or fails if
/// the individual parts cannot be determined.
////////////////////////////////////////////////////////////////////////////////
function PARSE_IDENTIFIER (value) {
"use strict";
if (TYPEWEIGHT(value) === TYPEWEIGHT_STRING) {
var parts = value.split('/');
if (parts.length === 2) {
return {
collection: parts[0],
key: parts[1]
};
}
// fall through intentional
}
else if (TYPEWEIGHT(value) === TYPEWEIGHT_DOCUMENT) {
if (value.hasOwnProperty('_id')) {
return PARSE_IDENTIFIER(value._id);
}
// fall through intentional
}
THROW(INTERNAL.errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH, "PARSE_IDENTIFIER");
}
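Stripped of the AQL `TYPEWEIGHT` machinery, the parsing rule above reduces to: a string must contain exactly one `/`, and a document falls back to its `_id`. A standalone sketch (the `parseIdentifier` name and the plain `Error` are illustrative, not the server-side implementation):

```javascript
function parseIdentifier (value) {
  if (typeof value === "string") {
    var parts = value.split('/');
    if (parts.length === 2) {
      return { collection: parts[0], key: parts[1] };
    }
    // fall through intentional: wrong number of '/' separators
  }
  else if (value !== null && typeof value === "object" &&
           value.hasOwnProperty('_id')) {
    return parseIdentifier(value._id);
  }
  throw new Error("PARSE_IDENTIFIER: invalid argument");
}

// parseIdentifier("_users/AbC")        -> { collection: "_users", key: "AbC" }
// parseIdentifier({ _id: "foo/bar" })  -> { collection: "foo", key: "bar" }
// parseIdentifier("foo/bar/baz")       -> throws
```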
////////////////////////////////////////////////////////////////////////////////
/// @brief check whether a document has a specific attribute
////////////////////////////////////////////////////////////////////////////////
@@ -4048,6 +4078,7 @@ exports.GRAPH_NEIGHBORS = GRAPH_NEIGHBORS;
exports.NOT_NULL = NOT_NULL;
exports.FIRST_LIST = FIRST_LIST;
exports.FIRST_DOCUMENT = FIRST_DOCUMENT;
exports.PARSE_IDENTIFIER = PARSE_IDENTIFIER;
exports.HAS = HAS;
exports.ATTRIBUTES = ATTRIBUTES;
exports.UNSET = UNSET;


@@ -822,6 +822,17 @@ exports.scanAppDirectory = function () {
scanDirectory(module.appPath());
};
////////////////////////////////////////////////////////////////////////////////
/// @brief rescans the FOXX application directory
/// this function is a trampoline for scanAppDirectory
/// the shorter function name is only here to keep compatibility with the
/// client-side foxx manager
////////////////////////////////////////////////////////////////////////////////
exports.rescan = function () {
return exports.scanAppDirectory();
};
////////////////////////////////////////////////////////////////////////////////
/// @brief mounts a FOXX application
///


@@ -1869,6 +1869,90 @@ function ahuacatlFunctionsTestSuite () {
assertEqual(expected, actual);
},
////////////////////////////////////////////////////////////////////////////////
/// @brief test parse identifier function
////////////////////////////////////////////////////////////////////////////////
testParseIdentifier : function () {
var actual;
actual = getQueryResults("RETURN PARSE_IDENTIFIER('foo/bar')");
assertEqual([ { collection: 'foo', key: 'bar' } ], actual);
actual = getQueryResults("RETURN PARSE_IDENTIFIER('this-is-a-collection-name/and-this-is-an-id')");
assertEqual([ { collection: 'this-is-a-collection-name', key: 'and-this-is-an-id' } ], actual);
actual = getQueryResults("RETURN PARSE_IDENTIFIER('MY_COLLECTION/MY_DOC')");
assertEqual([ { collection: 'MY_COLLECTION', key: 'MY_DOC' } ], actual);
actual = getQueryResults("RETURN PARSE_IDENTIFIER('_users/AbC')");
assertEqual([ { collection: '_users', key: 'AbC' } ], actual);
actual = getQueryResults("RETURN PARSE_IDENTIFIER({ _id: 'foo/bar', value: 'baz' })");
assertEqual([ { collection: 'foo', key: 'bar' } ], actual);
actual = getQueryResults("RETURN PARSE_IDENTIFIER({ ignore: true, _id: '_system/VALUE', value: 'baz' })");
assertEqual([ { collection: '_system', key: 'VALUE' } ], actual);
actual = getQueryResults("RETURN PARSE_IDENTIFIER({ value: 123, _id: 'Some-Odd-Collection/THIS_IS_THE_KEY' })");
assertEqual([ { collection: 'Some-Odd-Collection', key: 'THIS_IS_THE_KEY' } ], actual);
},
////////////////////////////////////////////////////////////////////////////////
/// @brief test parse identifier function
////////////////////////////////////////////////////////////////////////////////
testParseIdentifierCollection : function () {
var cn = "UnitTestsAhuacatlFunctions";
internal.db._drop(cn);
var cx = internal.db._create(cn);
cx.save({ "title" : "123", "value" : 456, "_key" : "foobar" });
cx.save({ "_key" : "so-this-is-it", "title" : "nada", "value" : 123 });
var expected, actual;
expected = [ { collection: cn, key: "foobar" } ];
actual = getQueryResults("RETURN PARSE_IDENTIFIER(DOCUMENT(CONCAT(@cn, '/', @key)))", { cn: cn, key: "foobar" });
assertEqual(expected, actual);
expected = [ { collection: cn, key: "foobar" } ];
actual = getQueryResults("RETURN PARSE_IDENTIFIER(DOCUMENT(CONCAT(@cn, '/', @key)))", { cn: cn, key: "foobar" });
assertEqual(expected, actual);
expected = [ { collection: cn, key: "foobar" } ];
actual = getQueryResults("RETURN PARSE_IDENTIFIER(DOCUMENT(CONCAT(@cn, '/', 'foobar')))", { cn: cn });
assertEqual(expected, actual);
expected = [ { collection: cn, key: "foobar" } ];
actual = getQueryResults("RETURN PARSE_IDENTIFIER(DOCUMENT([ @key ])[0])", { key: "UnitTestsAhuacatlFunctions/foobar" });
assertEqual(expected, actual);
expected = [ { collection: cn, key: "so-this-is-it" } ];
actual = getQueryResults("RETURN PARSE_IDENTIFIER(DOCUMENT([ 'UnitTestsAhuacatlFunctions/so-this-is-it' ])[0])");
assertEqual(expected, actual);
internal.db._drop(cn);
},
////////////////////////////////////////////////////////////////////////////////
/// @brief test parse identifier function
////////////////////////////////////////////////////////////////////////////////
testParseIdentifier : function () {
assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_NUMBER_MISMATCH.code, "RETURN PARSE_IDENTIFIER()");
assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_NUMBER_MISMATCH.code, "RETURN PARSE_IDENTIFIER('foo', 'bar')");
assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER(null)");
assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER(false)");
assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER(3)");
assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER(\"foo\")");
assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER('foo bar')");
assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER('foo/bar/baz')");
assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER([ ])");
assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER({ })");
assertQueryError(errors.ERROR_QUERY_FUNCTION_ARGUMENT_TYPE_MISMATCH.code, "RETURN PARSE_IDENTIFIER({ foo: 'bar' })");
},
////////////////////////////////////////////////////////////////////////////////
/// @brief test document function
////////////////////////////////////////////////////////////////////////////////


@@ -413,15 +413,12 @@ void RestJobHandler::getJob () {
///
/// @RESTURLPARAM{type,string,required}
/// The type of jobs to delete. `type` can be:
///
/// - `all`: deletes all jobs results. Currently executing or queued async jobs
/// will not be stopped by this call.
///
/// - `expired`: deletes expired results. To determine the expiration status of
/// a result, pass the `stamp` URL parameter. `stamp` needs to be a UNIX
/// timestamp, and all async job results created at a lower timestamp will be
/// deleted.
///
/// - an actual job-id: in this case, the call will remove the result of the
/// specified async job. If the job is currently executing or queued, it will
/// not be aborted.


@@ -258,6 +258,7 @@ ERROR_GRAPH_INVALID_EDGE,1906,"invalid edge","Will be raised when an invalid edg
ERROR_GRAPH_COULD_NOT_CREATE_EDGE,1907,"could not create edge","Will be raised when the edge could not be created"
ERROR_GRAPH_COULD_NOT_CHANGE_EDGE,1908,"could not change edge","Will be raised when the edge could not be changed"
ERROR_GRAPH_TOO_MANY_ITERATIONS,1909,"too many iterations","Will be raised when too many iterations are done in a graph traversal"
ERROR_GRAPH_INVALID_FILTER_RESULT,1910,"invalid filter result","Will be raised when an invalid filter result is returned in a graph traversal"
################################################################################
## Session errors


@@ -179,6 +179,7 @@ void TRI_InitialiseErrorMessages (void) {
REG_ERROR(ERROR_GRAPH_COULD_NOT_CREATE_EDGE, "could not create edge");
REG_ERROR(ERROR_GRAPH_COULD_NOT_CHANGE_EDGE, "could not change edge");
REG_ERROR(ERROR_GRAPH_TOO_MANY_ITERATIONS, "too many iterations");
REG_ERROR(ERROR_GRAPH_INVALID_FILTER_RESULT, "invalid filter result");
REG_ERROR(ERROR_SESSION_UNKNOWN, "unknown session");
REG_ERROR(ERROR_SESSION_EXPIRED, "session expired");
REG_ERROR(SIMPLE_CLIENT_UNKNOWN_ERROR, "unknown client error");


@@ -417,6 +417,9 @@ extern "C" {
/// Will be raised when the edge could not be changed
/// - 1909: @LIT{too many iterations}
/// Will be raised when too many iterations are done in a graph traversal
/// - 1910: @LIT{invalid filter result}
/// Will be raised when an invalid filter result is returned in a graph
/// traversal
/// - 1950: @LIT{unknown session}
/// Will be raised when an invalid/unknown session id is passed to the server
/// - 1951: @LIT{session expired}
@@ -2217,6 +2220,17 @@ void TRI_InitialiseErrorMessages (void);
#define TRI_ERROR_GRAPH_TOO_MANY_ITERATIONS (1909)
////////////////////////////////////////////////////////////////////////////////
/// @brief 1910: ERROR_GRAPH_INVALID_FILTER_RESULT
///
/// invalid filter result
///
/// Will be raised when an invalid filter result is returned in a graph
/// traversal
////////////////////////////////////////////////////////////////////////////////
#define TRI_ERROR_GRAPH_INVALID_FILTER_RESULT (1910)
////////////////////////////////////////////////////////////////////////////////
/// @brief 1950: ERROR_SESSION_UNKNOWN
///