mirror of https://gitee.com/bigwinds/arangodb
Merge branch 'devel' of ssh://github.com/ArangoDB/ArangoDB into devel
commit c7adab9088
@@ -1,6 +1,12 @@
 v2.8.0 (XXXX-XX-XX)
 -------------------
 
+* better error reporting for arangodump and arangorestore
+
+* arangodump will now fail by default when trying to dump edges that
+  refer to already dropped collections. This can be circumvented by
+  specifying the option `--force true` when invoking arangodump
+
 * fixed cluster upgrade procedure
 
 
@@ -1,9 +1,13 @@
 !CHAPTER Foxx console
 
-Foxx injects a **console** object into each Foxx app that allows writing log entries to the database and querying them from within the app itself.
+Foxx injects a **console** object into each Foxx app that allows writing log entries to the database in addition to the ArangoDB log file and querying them from within the app itself.
 
 The **console** object supports the CommonJS Console API found in Node.js and modern browsers, while also providing some ArangoDB-specific additions.
 
+ArangoDB also provides [the `console` module](../../ModuleConsole/README.md) which only supports the CommonJS Console API and only writes log entries to the ArangoDB log.
+
+When working with transactions, keep in mind that the Foxx console will attempt to write to the `_foxxlog` system collection. This behaviour can be disabled using the `setDatabaseLogging` method if you don't want to explicitly allow writing to the log collection during transactions or for performance reasons.
+
 !SECTION Logging
 
 !SUBSECTION Logging console messages
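A minimal sketch of the distinction described in this hunk, assuming it runs inside a Foxx app where the injected `console` object is in scope:

```js
// Inside a Foxx app: the injected console logs to the _foxxlog collection
// and (with this change) also to the ArangoDB log file.
console.log('request handled');

// The plain console module only writes to the ArangoDB log.
var plainConsole = require('console');
plainConsole.log('server-side message');
```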
@@ -19,8 +23,6 @@ If the first argument is not a formatting string or any of the additional argume
 **Examples**
 
 ```js
-var console = require("console");
-
 console.log("%s, %s!", "Hello", "World"); // => "Hello, World!"
 console.log("%s, World!", "Hello", "extra"); // => "Hello, World! extra"
 console.log("Hello,", "beautiful", "world!"); // => "Hello, beautiful world!"
@@ -40,11 +42,11 @@ By default, `console.log` uses log level **INFO**, making it functionally equiva
 
 The built-in log levels are:
 
-* -2: **TRACE**
-* -1: **DEBUG**
+* -200: **TRACE**
+* -100: **DEBUG**
 * 0: **INFO**
-* 1: **WARN**
-* 2: **ERROR**
+* 100: **WARN**
+* 200: **ERROR**
 
 !SUBSECTION Logging with timers
 
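A small sketch of how the renumbered built-in levels line up with the console methods (the numeric values come from the `Console` constructor changes further down in this commit):

```js
// Each method logs at the level listed above.
console.debug('verbose detail');   // -100 (DEBUG)
console.log('normal operation');   //    0 (INFO)
console.warn('something odd');     //  100 (WARN)
console.error('something broke');  //  200 (ERROR)
```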
@@ -160,7 +162,7 @@ This method returns a function that logs messages with the given log level (e.g.
 **Parameter**
 
 * **name**: name of the log level as it appears in the database, usually all-uppercase
-* **value** (optional): value of the log level. Default: `999`
+* **value** (optional): value of the log level. Default: `50`
 
 The **value** is used when determining whether a log entry meets the minimum log level that can be defined in various places. For a list of the built-in log levels and their values see the section on logging with different log levels above.
 
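A sketch of the `console.custom` method this hunk documents, using a made-up level name and a value chosen between WARN (100) and ERROR (200):

```js
// Returns a logging function bound to the new level.
var audit = console.custom('AUDIT', 150);
audit('user %s dropped collection %s', 'alice', 'test');
```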
@@ -188,6 +190,24 @@ If **trace** is set to `true`, all log entries will be logged with a parsed stac
 
 Because this results in every logging call creating a stack trace (which may have a significant performance impact), this option is disabled by default.
 
+!SUBSECTION Disabling logging to the ArangoDB console
+
+You can toggle whether logs should be written to the ArangoDB console.
+
+`console.setNativeLogging(nativeLogging)`
+
+If **nativeLogging** is set to `false`, log entries will not be logged to the ArangoDB console (which usually writes to the file system).
+
+!SUBSECTION Disabling logging to the database
+
+You can toggle whether logs should be written to the database.
+
+`console.setDatabaseLogging(databaseLogging)`
+
+If **databaseLogging** is set to `false`, log entries will not be logged to the internal `_foxxlog` collection.
+
+This is only useful if logging to the ArangoDB console is not also disabled.
+
 !SUBSECTION Enabling assertion errors
 
 You can toggle whether console assertions should throw if they fail.
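A combined sketch of the two toggles documented above, e.g. to avoid `_foxxlog` writes inside a transaction:

```js
// Keep writing to the ArangoDB log, but skip the _foxxlog collection.
console.setNativeLogging(true);
console.setDatabaseLogging(false);
```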
@@ -2461,7 +2461,7 @@ int RestReplicationHandler::applyCollectionDumpMarker (CollectionNameResolver co
 const TRI_voc_key_t key,
 const TRI_voc_rid_t rid,
 TRI_json_t const* json,
-string& errorMsg) {
+std::string& errorMsg) {
 
 if (type == REPLICATION_MARKER_DOCUMENT ||
 type == REPLICATION_MARKER_EDGE) {
@@ -2474,8 +2474,6 @@ int RestReplicationHandler::applyCollectionDumpMarker (CollectionNameResolver co
 TRI_shaped_json_t* shaped = TRI_ShapedJsonJson(document->getShaper(), json, true); // PROTECTED by trx in trxCollection
 
 if (shaped == nullptr) {
-errorMsg = TRI_errno_string(TRI_ERROR_OUT_OF_MEMORY);
-
 return TRI_ERROR_OUT_OF_MEMORY;
 }
 
@@ -2491,6 +2489,7 @@ int RestReplicationHandler::applyCollectionDumpMarker (CollectionNameResolver co
 // edge
 if (document->_info._type != TRI_COL_TYPE_EDGE) {
 res = TRI_ERROR_ARANGO_COLLECTION_TYPE_INVALID;
+errorMsg = "expecting edge collection, got document collection";
 }
 else {
 res = TRI_ERROR_NO_ERROR;
@@ -2503,11 +2502,13 @@ int RestReplicationHandler::applyCollectionDumpMarker (CollectionNameResolver co
 TRI_document_edge_t edge;
 if (! DocumentHelper::parseDocumentId(resolver, from.c_str(), edge._fromCid, &edge._fromKey)) {
 res = TRI_ERROR_ARANGO_DOCUMENT_HANDLE_BAD;
+errorMsg = std::string("handle bad or collection unknown '") + from.c_str() + "'";
 }
 
 // parse _to
 if (! DocumentHelper::parseDocumentId(resolver, to.c_str(), edge._toCid, &edge._toKey)) {
 res = TRI_ERROR_ARANGO_DOCUMENT_HANDLE_BAD;
+errorMsg = std::string("handle bad or collection unknown '") + to.c_str() + "'";
 }
 
 if (res == TRI_ERROR_NO_ERROR) {
@@ -2519,6 +2520,7 @@ int RestReplicationHandler::applyCollectionDumpMarker (CollectionNameResolver co
 // document
 if (document->_info._type != TRI_COL_TYPE_DOCUMENT) {
 res = TRI_ERROR_ARANGO_COLLECTION_TYPE_INVALID;
+errorMsg = std::string(TRI_errno_string(res)) + ": expecting document collection, got edge collection";
 }
 else {
 res = TRI_InsertShapedJsonDocumentCollection(trxCollection, key, rid, nullptr, &mptr, shaped, nullptr, false, false, true);
@@ -2610,13 +2612,9 @@ int RestReplicationHandler::processRestoreDataBatch (CollectionNameResolver cons
 
 if (pos - ptr > 1) {
 // found something
-TRI_json_t* json = TRI_JsonString(TRI_CORE_MEM_ZONE, ptr);
+std::unique_ptr<TRI_json_t> json(TRI_JsonString(TRI_CORE_MEM_ZONE, ptr));
 
-if (! JsonHelper::isObject(json)) {
-if (json != nullptr) {
-TRI_FreeJson(TRI_CORE_MEM_ZONE, json);
-}
-
+if (! JsonHelper::isObject(json.get())) {
 errorMsg = invalidMsg;
 
 return TRI_ERROR_HTTP_CORRUPTED_JSON;
@@ -2627,20 +2625,19 @@ int RestReplicationHandler::processRestoreDataBatch (CollectionNameResolver cons
 TRI_voc_rid_t rid = 0;
 TRI_json_t const* doc = nullptr;
 
-size_t const n = TRI_LengthVector(&json->_value._objects);
+size_t const n = TRI_LengthVector(&(json.get()->_value._objects));
 
 for (size_t i = 0; i < n; i += 2) {
-TRI_json_t const* element = static_cast<TRI_json_t const*>(TRI_AtVector(&json->_value._objects, i));
+auto const* element = static_cast<TRI_json_t const*>(TRI_AtVector(&(json.get()->_value._objects), i));
 
 if (! JsonHelper::isString(element)) {
-TRI_FreeJson(TRI_CORE_MEM_ZONE, json);
 errorMsg = invalidMsg;
 
 return TRI_ERROR_HTTP_CORRUPTED_JSON;
 }
 
-const char* attributeName = element->_value._string.data;
-TRI_json_t const* value = static_cast<TRI_json_t const*>(TRI_AtVector(&json->_value._objects, i + 1));
+char const* attributeName = element->_value._string.data;
+auto const* value = static_cast<TRI_json_t const*>(TRI_AtVector(&(json.get()->_value._objects), i + 1));
 
 if (TRI_EqualString(attributeName, "type")) {
 if (JsonHelper::isNumber(value)) {
@@ -2669,7 +2666,6 @@ int RestReplicationHandler::processRestoreDataBatch (CollectionNameResolver cons
 
 // key must not be 0, but doc can be 0!
 if (key == nullptr) {
-TRI_FreeJson(TRI_CORE_MEM_ZONE, json);
 errorMsg = invalidMsg;
 
 return TRI_ERROR_HTTP_BAD_PARAMETER;
@@ -2677,8 +2673,6 @@ int RestReplicationHandler::processRestoreDataBatch (CollectionNameResolver cons
 
 int res = applyCollectionDumpMarker(resolver, trxCollection, type, (const TRI_voc_key_t) key, rid, doc, errorMsg);
 
-TRI_FreeJson(TRI_CORE_MEM_ZONE, json);
-
 if (res != TRI_ERROR_NO_ERROR && ! force) {
 return res;
 }
@@ -2698,7 +2692,7 @@ int RestReplicationHandler::processRestoreData (CollectionNameResolver const& re
 TRI_voc_cid_t cid,
 bool useRevision,
 bool force,
-string& errorMsg) {
+std::string& errorMsg) {
 
 SingleCollectionWriteTransaction<UINT64_MAX> trx(new StandaloneTransactionContext(), _vocbase, cid);
 
@@ -2767,13 +2761,18 @@ void RestReplicationHandler::handleCommandRestoreData () {
 force = StringUtils::boolean(value);
 }
 
-string errorMsg;
+std::string errorMsg;
 
 int res = processRestoreData(resolver, cid, recycleIds, force, errorMsg);
 
 if (res != TRI_ERROR_NO_ERROR) {
+if (errorMsg.empty()) {
 generateError(HttpResponse::responseCode(res), res);
 }
+else {
+generateError(HttpResponse::responseCode(res), res, std::string(TRI_errno_string(res)) + ": " + errorMsg);
+}
+}
 else {
 TRI_json_t result;
 
@@ -3413,6 +3412,9 @@ void RestReplicationHandler::handleCommandRemoveKeys () {
 /// @RESTQUERYPARAM{includeSystem,boolean,optional}
 /// Include system collections in the result. The default value is *true*.
 ///
+/// @RESTQUERYPARAM{failOnUnknown,boolean,optional}
+/// Produce an error when dumped edges refer to now-unknown collections.
+///
 /// @RESTQUERYPARAM{ticks,boolean,optional}
 /// Whether or not to include tick values in the dump. The default value is *true*.
 ///
@@ -3545,6 +3547,7 @@ void RestReplicationHandler::handleCommandDump () {
 bool flush = true; // flush WAL before dumping?
 bool withTicks = true;
 bool translateCollectionIds = true;
+bool failOnUnknown = false;
 uint64_t flushWait = 0;
 
 bool found;
@@ -3557,6 +3560,13 @@ void RestReplicationHandler::handleCommandDump () {
 flush = StringUtils::boolean(value);
 }
 
+// fail on unknown collection names referenced in edges
+value = _request->value("failOnUnknown", found);
+
+if (found) {
+failOnUnknown = StringUtils::boolean(value);
+}
+
 // determine flush WAL wait time value
 value = _request->value("flushWait", found);
 
@@ -3639,7 +3649,7 @@ void RestReplicationHandler::handleCommandDump () {
 // initialize the dump container
 TRI_replication_dump_t dump(_vocbase, (size_t) determineChunkSize(), includeSystem);
 
-res = TRI_DumpCollectionReplication(&dump, col, tickStart, tickEnd, withTicks, translateCollectionIds);
+res = TRI_DumpCollectionReplication(&dump, col, tickStart, tickEnd, withTicks, translateCollectionIds, failOnUnknown);
 
 if (res != TRI_ERROR_NO_ERROR) {
 THROW_ARANGO_EXCEPTION(res);
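The hunks above add a `failOnUnknown` query parameter to the replication dump API; a hedged arangosh sketch of how a client could pass it (the collection name `mycol` is made up, and the response handling is omitted):

```js
// arangosh: request a dump and ask the server to error out on edges
// whose referenced collections no longer exist.
var arango = require('internal').arango;
var res = arango.GET('/_api/replication/dump?collection=mycol&failOnUnknown=true');
```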
@@ -153,6 +153,7 @@ static char const* NameFromCid (TRI_replication_dump_t* dump,
 static int AppendCollection (TRI_replication_dump_t* dump,
 TRI_voc_cid_t cid,
 bool translateCollectionIds,
+bool failOnUnknown,
 triagens::arango::CollectionNameResolver* resolver) {
 if (translateCollectionIds) {
 if (cid > 0) {
@@ -164,10 +165,20 @@ static int AppendCollection (TRI_replication_dump_t* dump,
 name = resolver->getCollectionName(cid);
 }
 APPEND_STRING(dump->_buffer, name.c_str());
+if (failOnUnknown &&
+name[0] == '_' &&
+name == "_unknown") {
+return TRI_ERROR_ARANGO_DOCUMENT_HANDLE_BAD;
+}
+
 return TRI_ERROR_NO_ERROR;
 }
 
 APPEND_STRING(dump->_buffer, "_unknown");
+
+if (failOnUnknown) {
+return TRI_ERROR_ARANGO_DOCUMENT_HANDLE_BAD;
+}
 }
 else {
 APPEND_UINT64(dump->_buffer, (uint64_t) cid);
@@ -291,6 +302,7 @@ static int StringifyMarkerDump (TRI_replication_dump_t* dump,
 TRI_df_marker_t const* marker,
 bool withTicks,
 bool translateCollectionIds,
+bool failOnUnknown,
 triagens::arango::CollectionNameResolver* resolver) {
 // This covers two cases:
 // 1. document is not nullptr and marker points into a data file
@@ -427,7 +439,7 @@ static int StringifyMarkerDump (TRI_replication_dump_t* dump,
 
 int res;
 APPEND_STRING(buffer, ",\"" TRI_VOC_ATTRIBUTE_FROM "\":\"");
-res = AppendCollection(dump, fromCid, translateCollectionIds, resolver);
+res = AppendCollection(dump, fromCid, translateCollectionIds, failOnUnknown, resolver);
 
 if (res != TRI_ERROR_NO_ERROR) {
 return res;
@@ -436,7 +448,7 @@ static int StringifyMarkerDump (TRI_replication_dump_t* dump,
 APPEND_STRING(buffer, "\\/");
 APPEND_STRING(buffer, fromKey);
 APPEND_STRING(buffer, "\",\"" TRI_VOC_ATTRIBUTE_TO "\":\"");
-res = AppendCollection(dump, toCid, translateCollectionIds, resolver);
+res = AppendCollection(dump, toCid, translateCollectionIds, failOnUnknown, resolver);
 
 if (res != TRI_ERROR_NO_ERROR) {
 return res;
@@ -1178,6 +1190,7 @@ static int DumpCollection (TRI_replication_dump_t* dump,
 TRI_voc_tick_t dataMax,
 bool withTicks,
 bool translateCollectionIds,
+bool failOnUnknown,
 triagens::arango::CollectionNameResolver* resolver) {
 TRI_string_buffer_t* buffer;
 TRI_voc_tick_t lastFoundTick;
@@ -1332,7 +1345,7 @@ static int DumpCollection (TRI_replication_dump_t* dump,
 }
 
 
-res = StringifyMarkerDump(dump, document, marker, withTicks, translateCollectionIds, resolver);
+res = StringifyMarkerDump(dump, document, marker, withTicks, translateCollectionIds, failOnUnknown, resolver);
 
 if (res != TRI_ERROR_NO_ERROR) {
 break; // will go to NEXT_DF
@@ -1395,7 +1408,8 @@ int TRI_DumpCollectionReplication (TRI_replication_dump_t* dump,
 TRI_voc_tick_t dataMin,
 TRI_voc_tick_t dataMax,
 bool withTicks,
-bool translateCollectionIds) {
+bool translateCollectionIds,
+bool failOnUnknown) {
 TRI_ASSERT(col != nullptr);
 TRI_ASSERT(col->_collection != nullptr);
 
@@ -1416,7 +1430,7 @@ int TRI_DumpCollectionReplication (TRI_replication_dump_t* dump,
 
 try {
 res = DumpCollection(dump, document, dataMin, dataMax, withTicks,
-translateCollectionIds, &resolver);
+translateCollectionIds, failOnUnknown, &resolver);
 }
 catch (...) {
 res = TRI_ERROR_INTERNAL;
@@ -120,6 +120,7 @@ int TRI_DumpCollectionReplication (TRI_replication_dump_t*,
 TRI_voc_tick_t,
 TRI_voc_tick_t,
 bool,
+bool,
 bool);
 
 ////////////////////////////////////////////////////////////////////////////////
@@ -282,7 +282,7 @@ static void LocalExitFunction (int exitCode, void* data) {
 
 static string GetHttpErrorMessage (SimpleHttpResult* result) {
 StringBuffer const& body = result->getBody();
-string details;
+std::string details;
 
 std::unique_ptr<TRI_json_t> json(JsonHelper::fromString(body.c_str(), body.length()));
 
@@ -505,6 +505,13 @@ static int DumpCollection (int fd,
 url += "&to=" + StringUtils::itoa(maxTick);
 }
 
+if (Force) {
+url += "&failOnUnknown=false";
+}
+else {
+url += "&failOnUnknown=true";
+}
+
 Stats._totalBatches++;
 
 std::unique_ptr<SimpleHttpResult> response(Client->request(HttpRequest::HTTP_REQUEST_GET,
@@ -746,9 +753,11 @@ static int RunDump (string& errorMsg) {
 return TRI_ERROR_INTERNAL;
 }
 
-const string cid = JsonHelper::getStringValue(parameters, "cid", "");
-const string name = JsonHelper::getStringValue(parameters, "name", "");
-const bool deleted = JsonHelper::getBooleanValue(parameters, "deleted", false);
+std::string const cid = JsonHelper::getStringValue(parameters, "cid", "");
+std::string const name = JsonHelper::getStringValue(parameters, "name", "");
+bool const deleted = JsonHelper::getBooleanValue(parameters, "deleted", false);
+int type = JsonHelper::getNumericValue<int>(parameters, "type", 2);
+std::string const collectionType(type == 2 ? "document" : "edge");
 
 if (cid == "" || name == "") {
 errorMsg = "got malformed JSON response from server";
@@ -774,7 +783,7 @@ static int RunDump (string& errorMsg) {
 
 // found a collection!
 if (Progress) {
-cout << "dumping collection '" << name << "'..." << endl;
+cout << "# Dumping " << collectionType << " collection '" << name << "'..." << endl;
 }
 
 // now save the collection meta data and/or the actual data
@@ -1054,7 +1063,7 @@ static int RunClusterDump (string& errorMsg) {
 
 // found a collection!
 if (Progress) {
-cout << "dumping collection '" << name << "'..." << endl;
+cout << "# Dumping collection '" << name << "'..." << endl;
 }
 
 // now save the collection meta data and/or the actual data
@@ -1124,7 +1133,7 @@ static int RunClusterDump (string& errorMsg) {
 string shardName = it->first;
 string DBserver = it->second;
 if (Progress) {
-cout << "dumping shard '" << shardName << "' from DBserver '"
+cout << "# Dumping shard '" << shardName << "' from DBserver '"
 << DBserver << "' ..." << endl;
 }
 res = StartBatch(DBserver, errorMsg);
@@ -1381,6 +1390,17 @@ int main (int argc, char* argv[]) {
 res = TRI_ERROR_INTERNAL;
 }
 
+if (res != TRI_ERROR_NO_ERROR) {
+if (! errorMsg.empty()) {
+cerr << "Error: " << errorMsg << endl;
+}
+else {
+cerr << "An error occurred" << endl;
+}
+ret = EXIT_FAILURE;
+}
+
+
 if (Progress) {
 if (DumpData) {
 cout << "Processed " << Stats._totalCollections << " collection(s), " <<
@@ -1392,13 +1412,6 @@ int main (int argc, char* argv[]) {
 }
 }
 
-if (res != TRI_ERROR_NO_ERROR) {
-if (! errorMsg.empty()) {
-cerr << "Error: " << errorMsg << endl;
-}
-ret = EXIT_FAILURE;
-}
-
 delete Client;
 
 TRIAGENS_REST_SHUTDOWN;
@@ -281,18 +281,16 @@ static void LocalExitFunction (int exitCode, void* data) {
 /// @brief extract an error message from a response
 ////////////////////////////////////////////////////////////////////////////////
 
-static string GetHttpErrorMessage (SimpleHttpResult* result) {
-const StringBuffer& body = result->getBody();
-string details;
+static std::string GetHttpErrorMessage (SimpleHttpResult* result) {
+StringBuffer const& body = result->getBody();
+std::string details;
 LastErrorCode = TRI_ERROR_NO_ERROR;
 
-TRI_json_t* json = JsonHelper::fromString(body.c_str(), body.length());
+std::unique_ptr<TRI_json_t> json(JsonHelper::fromString(body.c_str(), body.length()));
 
 if (json != nullptr) {
-const string& errorMessage = JsonHelper::getStringValue(json, "errorMessage", "");
-const int errorNum = JsonHelper::getNumericValue<int>(json, "errorNum", 0);
+std::string const& errorMessage = JsonHelper::getStringValue(json.get(), "errorMessage", "");
+int const errorNum = JsonHelper::getNumericValue<int>(json.get(), "errorNum", 0);
 
-TRI_FreeJson(TRI_UNKNOWN_MEM_ZONE, json);
-
 if (errorMessage != "" && errorNum > 0) {
 details = ": ArangoError " + StringUtils::itoa(errorNum) + ": " + errorMessage;
@@ -311,7 +309,6 @@ static string GetHttpErrorMessage (SimpleHttpResult* result) {
 ////////////////////////////////////////////////////////////////////////////////
 
 static int TryCreateDatabase (std::string const& name) {
-
 triagens::basics::Json json(triagens::basics::Json::Object);
 json("name", triagens::basics::Json(name));
 
@@ -466,6 +463,9 @@ static int SendRestoreCollection (TRI_json_t const* json,
 
 if (response->wasHttpError()) {
 errorMsg = GetHttpErrorMessage(response.get());
+if (LastErrorCode != TRI_ERROR_NO_ERROR) {
+return LastErrorCode;
+}
 
 return TRI_ERROR_INTERNAL;
 }
@@ -495,6 +495,9 @@ static int SendRestoreIndexes (TRI_json_t const* json,
 
 if (response->wasHttpError()) {
 errorMsg = GetHttpErrorMessage(response.get());
+if (LastErrorCode != TRI_ERROR_NO_ERROR) {
+return LastErrorCode;
+}
 
 return TRI_ERROR_INTERNAL;
 }
@@ -528,6 +531,9 @@ static int SendRestoreData (string const& cname,
 
 if (response->wasHttpError()) {
 errorMsg = GetHttpErrorMessage(response.get());
+if (LastErrorCode != TRI_ERROR_NO_ERROR) {
+return LastErrorCode;
+}
 
 return TRI_ERROR_INTERNAL;
 }
@@ -563,7 +569,7 @@ static int SortCollections (const void* l,
 /// @brief process all files from the input directory
 ////////////////////////////////////////////////////////////////////////////////
 
-static int ProcessInputDirectory (string& errorMsg) {
+static int ProcessInputDirectory (std::string& errorMsg) {
 // create a lookup table for collections
 map<string, bool> restrictList;
 for (size_t i = 0; i < Collections.size(); ++i) {
@@ -674,20 +680,23 @@ static int ProcessInputDirectory (string& errorMsg) {
 // step2: run the actual import
 {
 for (size_t i = 0; i < n; ++i) {
-TRI_json_t const* json = (TRI_json_t const*) TRI_AtVector(&collections->_value._objects, i);
+TRI_json_t const* json = static_cast<TRI_json_t const*>(TRI_AtVector(&collections->_value._objects, i));
 TRI_json_t const* parameters = JsonHelper::getObjectElement(json, "parameters");
 TRI_json_t const* indexes = JsonHelper::getObjectElement(json, "indexes");
-const string cname = JsonHelper::getStringValue(parameters, "name", "");
-const string cid = JsonHelper::getStringValue(parameters, "cid", "");
+std::string const cname = JsonHelper::getStringValue(parameters, "name", "");
+std::string const cid = JsonHelper::getStringValue(parameters, "cid", "");
 
+int type = JsonHelper::getNumericValue<int>(parameters, "type", 2);
+std::string const collectionType(type == 2 ? "document" : "edge");
+
 if (ImportStructure) {
 // re-create collection
 if (Progress) {
 if (Overwrite) {
-cout << "Re-creating collection '" << cname << "'..." << endl;
+cout << "# Re-creating " << collectionType << " collection '" << cname << "'..." << endl;
 }
 else {
-cout << "Creating collection '" << cname << "'..." << endl;
+cout << "# Creating " << collectionType << " collection '" << cname << "'..." << endl;
 }
 }
 
@@ -718,7 +727,7 @@ static int ProcessInputDirectory (string& errorMsg) {
 // found a datafile
 
 if (Progress) {
-cout << "Loading data into collection '" << cname << "'..." << endl;
+cout << "# Loading data into " << collectionType << " collection '" << cname << "'..." << endl;
 }
 
 int fd = TRI_OPEN(datafile.c_str(), O_RDONLY);
@@ -829,7 +838,7 @@ static int ProcessInputDirectory (string& errorMsg) {
 if (TRI_LengthVector(&indexes->_value._objects) > 0) {
 // we actually have indexes
 if (Progress) {
-cout << "Creating indexes for collection '" << cname << "'..." << endl;
+cout << "# Creating indexes for collection '" << cname << "'..." << endl;
 }
 
 int res = SendRestoreIndexes(json, errorMsg);
@@ -1005,7 +1014,7 @@ int main (int argc, char* argv[]) {
 }
 
 if (Progress) {
-cout << "Connected to ArangoDB '" << BaseClient.endpointServer()->getSpecification() << endl;
+cout << "# Connected to ArangoDB '" << BaseClient.endpointServer()->getSpecification() << "'" << endl;
 }
 
 memset(&Stats, 0, sizeof(Stats));
@@ -1025,6 +1034,16 @@ int main (int argc, char* argv[]) {
 res = TRI_ERROR_INTERNAL;
 }
 
+if (res != TRI_ERROR_NO_ERROR) {
+if (! errorMsg.empty()) {
+cerr << "Error: " << errorMsg << endl;
+}
+else {
+cerr << "An error occurred" << endl;
+}
+ret = EXIT_FAILURE;
+}
+
 
 if (Progress) {
 if (ImportData) {
@@ -1037,14 +1056,6 @@ int main (int argc, char* argv[]) {
 }
 }
 
-
-if (res != TRI_ERROR_NO_ERROR) {
-if (! errorMsg.empty()) {
-cerr << "Error: " << errorMsg << endl;
-}
-ret = EXIT_FAILURE;
-}
-
 if (Client != nullptr) {
 delete Client;
 }
@@ -337,6 +337,7 @@ exports.infoLines = function () {
 ////////////////////////////////////////////////////////////////////////////////
 
 exports.log = exports.info;
+exports._log = log;
 
 ////////////////////////////////////////////////////////////////////////////////
 /// @brief logLines
@@ -81,11 +81,13 @@ var createRoutePlannerGraph = function() {
 );
 
 var g = Graph._create("routeplanner", edgeDefinition);
-var berlin = g.germanCity.save({_key: "Berlin", population : 3000000, isCapital : true});
-var cologne = g.germanCity.save({_key: "Cologne", population : 1000000, isCapital : false});
-var hamburg = g.germanCity.save({_key: "Hamburg", population : 1000000, isCapital : false});
-var lyon = g.frenchCity.save({_key: "Lyon", population : 80000, isCapital : false});
-var paris = g.frenchCity.save({_key: "Paris", population : 4000000, isCapital : true});
+var berlin = g.germanCity.save({_key: "Berlin", population : 3000000, isCapital : true, loc: [52.5167, 13.3833]});
+var cologne = g.germanCity.save({_key: "Cologne", population : 1000000, isCapital : false, loc: [50.9364, 6.9528]});
+var hamburg = g.germanCity.save({_key: "Hamburg", population : 1000000, isCapital : false, loc: [53.5653, 10.0014]});
+var lyon = g.frenchCity.save({_key: "Lyon", population : 80000, isCapital : false, loc: [45.7600, 4.8400]});
+var paris = g.frenchCity.save({_key: "Paris", population : 4000000, isCapital : true, loc: [48.8567, 2.3508]});
+g.germanCity.ensureGeoIndex("loc");
+g.frenchCity.ensureGeoIndex("loc");
 g.germanHighway.save(berlin._id, cologne._id, {distance: 850});
 g.germanHighway.save(berlin._id, hamburg._id, {distance: 400});
 g.germanHighway.save(hamburg._id, cologne._id, {distance: 500});
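With the `loc` attributes and geo indexes added above, the example graph can serve geo queries; a hedged sketch using AQL's `NEAR` against the `germanCity` collection (coordinates reused from the fixture):

```js
// Find the two German cities closest to Berlin's coordinates.
var db = require('org/arangodb').db;
db._query(
  'FOR c IN NEAR(germanCity, 52.5167, 13.3833, 2) RETURN c._key'
).toArray();
```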
@@ -31,11 +31,38 @@
 var qb = require('aqb');
 var util = require('util');
 var extend = require('underscore').extend;
+var arangoConsole = require('console');
 var ErrorStackParser = require('error-stack-parser');
 var AssertionError = require('assert').AssertionError;
 var exists = require('org/arangodb/is').existy;
 var db = require('org/arangodb').db;
+
+const NATIVE_LOG_LEVELS = ['debug', 'info', 'warn', 'error'];
+
+function nativeLogger(level, levelNum, mount) {
+let logLevel = String(level).toLowerCase();
+if (logLevel === 'trace' && levelNum === -200) {
+logLevel = 'info'; // require('console').trace also uses INFO level
+}
+if (NATIVE_LOG_LEVELS.indexOf(logLevel) !== -1) {
+return function (message) {
+arangoConsole._log(logLevel, `${mount} ${message}`);
+};
+}
+if (levelNum >= 200) {
+logLevel = 'error';
+} else if (levelNum >= 100) {
+logLevel = 'warn';
+} else if (levelNum <= -100) {
+logLevel = 'debug';
+} else {
+logLevel = 'info';
+}
+return function (message) {
+arangoConsole._log(logLevel, `(${level}) ${mount} ${message}`);
+};
+}
 
 function ConsoleLogs(console) {
 this._console = console;
 this.defaultMaxAge = 2 * 60 * 60 * 1000;
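The threshold logic that `nativeLogger` applies to custom level numbers can be summarised in a small standalone sketch (not the module itself, just the fallback mapping):

```js
// Custom level numbers fall back to the nearest native log level.
function mapLevelNum(levelNum) {
  if (levelNum >= 200) { return 'error'; }
  if (levelNum >= 100) { return 'warn'; }
  if (levelNum <= -100) { return 'debug'; }
  return 'info';
}
mapLevelNum(150); // => 'warn'
```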
@@ -131,8 +158,10 @@ function Console(mount, tracing) {
 this._mount = mount;
 this._timers = Object.create(null);
 this._tracing = Boolean(tracing);
+this._nativeLogging = true;
+this._databaseLogging = true;
 this._logLevel = -999;
-this._logLevels = {TRACE: -2};
+this._logLevels = {TRACE: -200};
 this._assertThrows = false;
 this.logs = new ConsoleLogs(this);
 
@@ -142,10 +171,10 @@ function Console(mount, tracing) {
 }
 }.bind(this));
 
-this.debug = this.custom('DEBUG', -1);
+this.debug = this.custom('DEBUG', -100);
 this.info = this.custom('INFO', 0);
-this.warn = this.custom('WARN', 1);
-this.error = this.custom('ERROR', 2);
+this.warn = this.custom('WARN', 100);
+this.error = this.custom('ERROR', 200);
 
 this.assert.level = 'ERROR';
 this.dir.level = 'INFO';
@@ -170,14 +199,28 @@ extend(Console.prototype, {
 level: level,
 levelNum: this._logLevels[level],
 time: Date.now(),
-message: message
+message: String(message)
 };
+
+let logLine;
+
+if (this._nativeLogging) {
+logLine = nativeLogger(level, doc.levelNum, doc.mount);
+doc.message.split('\n').forEach(logLine);
+}
 
 if (this._tracing) {
-var e = new Error();
+let e = new Error();
 Error.captureStackTrace(e, callee || this._log);
 e.stack = e.stack.replace(/\n+$/, '');
 doc.stack = ErrorStackParser.parse(e).slice(1);
+if (this._nativeLogging) {
+e.stack.split('\n').slice(2).forEach(logLine);
+}
+}
+
+if (!this._databaseLogging) {
+return;
 }
 
 if (!db._foxxlog) {
@@ -240,7 +283,7 @@ extend(Console.prototype, {
 custom: function (level, weight) {
 level = String(level);
 weight = Number(weight);
-weight = weight === weight ? weight : 999;
+weight = weight === weight ? weight : 50;
 this._logLevels[level] = weight;
 var logWithLevel = function() {
 this._log(level, util.format.apply(null, arguments), logWithLevel);
@@ -264,6 +307,16 @@ extend(Console.prototype, {
 return this._tracing;
 },
 
+setNativeLogging: function (nativeLogging) {
+this._nativeLogging = Boolean(nativeLogging);
+return this._nativeLogging;
+},
+
+setDatabaseLogging: function (databaseLogging) {
+this._databaseLogging = Boolean(databaseLogging);
+return this._databaseLogging;
+},
+
 setAssertThrows: function (assertThrows) {
 this._assertThrows = Boolean(assertThrows);
 return this._assertThrows;
@@ -346,6 +346,7 @@ class FoxxService {
 filename = path.resolve(this.main.context.__dirname, filename);
 
 var module = new Module(filename, this.main);
+module.context.console = this.main.context.console;
 module.context.applicationContext = _.extend(
 new AppContext(this.main.context.applicationContext._service),
 this.main.context.applicationContext,