Update documentation on external datasets and loading
diff --git a/asterix-doc/src/site/markdown/aql/manual.md b/asterix-doc/src/site/markdown/aql/manual.md
index f1c3fbd..97f70c4 100644
--- a/asterix-doc/src/site/markdown/aql/manual.md
+++ b/asterix-doc/src/site/markdown/aql/manual.md
@@ -552,7 +552,7 @@
An Internal dataset (the default) is a dataset that is stored in and managed by AsterixDB.
It must have a specified unique primary key that can be used to partition data across nodes of an AsterixDB cluster.
The primary key is also used in secondary indexes to uniquely identify the indexed primary data records.
-An External dataset is stored outside of AsterixDB, e.g., in HDFS or in the local filesystem(s) of the cluster's nodes.
+An External dataset is stored outside of AsterixDB (currently datasets in HDFS or on the local filesystem(s) of the cluster's nodes are supported).
External dataset support allows AQL queries to treat external data as though it were stored in AsterixDB,
making it possible to query "legacy" file data (e.g., Hive data) without having to physically import it into AsterixDB.
For an external dataset, an appropriate adaptor must be selected to handle the nature of the desired external data.
@@ -565,14 +565,15 @@
create internal dataset FacebookUsers(FacebookUserType) primary key id;
The next example creates an external dataset for storing LineitemType records.
-The choice of the `localfs` adaptor means that its data will reside in the local filesystem of the cluster nodes.
-The create statement provides several parameters used by the localfs adaptor;
-e.g., the file format is delimited text with vertical bar being the field delimiter.
+The choice of the `hdfs` adaptor means that its data will reside in HDFS.
+The create statement provides parameters used by the hdfs adaptor:
+the URL and path needed to locate the data in HDFS and a description of the data format.
##### Example
-
- create external dataset Lineitem(LineitemType) using localfs (
- ("path"="127.0.0.1://SOURCE_PATH"),
+ create external dataset Lineitem(LineitemType) using hdfs (
+ ("hdfs"="hdfs://HOST:PORT"),
+ ("path"="HDFS_PATH"),
+ ("input-format"="text-input-format"),
("format"="delimited-text"),
("delimiter"="|"));
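As a sketch (not part of this patch), once the external dataset above is defined it can be queried like any internal dataset; the dataverse name `TPCH` and the field `l_orderkey` are assumed here from the standard TPC-H Lineitem schema:

    use dataverse TPCH;

    // Select a single order's line items from the external dataset.
    for $l in dataset('Lineitem')
    where $l.l_orderkey = 1
    return $l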
@@ -672,6 +673,7 @@
The load statement is used to initially populate a dataset via bulk loading of data from an external file.
An appropriate adaptor must be selected to handle the nature of the desired external data.
+The load statement accepts the same adaptors and the same parameters as external datasets.
(See the [guide to external data](externaldata.html) for more information on the available adaptors.)
The following example shows how to bulk load the FacebookUsers dataset from an external file containing