Welcome to the Couchbase Shell (cbsh) documentation! If you are new, you will want to start with the quickstart section and then proceed with the introduction. If you are already familiar with the shell, feel free to jump right into the command reference.
Note that while the project is maintained by Couchbase, it is not covered under the EE support contract. We provide community support through this bug tracker.
1. Quickstart
1.1. Installation
The current latest version is 0.75. The version of cbsh is kept in line with the underlying version of nushell that is supported. There are a couple of ways to get access to cbsh; the easiest is to download our pre-built binaries for your platform of choice:
- macOS: cbsh-x86_64-apple-darwin.zip
- Linux aarch64: cbsh-aarch64-unknown-linux-gnu.tar.gz
- macOS aarch64: cbsh-aarch64-apple-darwin.zip
- Windows: cbsh-x86_64-pc-windows-msvc.zip
Once you’ve downloaded the zip file, extract it and switch into the newly created directory. The following example shows it for macOS, but it works very similarly on Linux (just align the commands with the file you just downloaded):
$ unzip cbsh-x86_64-apple-darwin.zip
$ ls
cbsh LICENSE LICENSE_AGREEMENT README.md
You can now run the cbsh binary:
❯ ./cbsh --version
The Couchbase Shell 0.75.1
Tip: If you are running a recent macOS release (10.15.x or later), you’ll likely see an error similar to "cbsh" was blocked from use because it is not from an identified developer. This is because our binaries are not yet signed. To run it nonetheless, you need to either navigate to System Preferences → Security & Privacy and click Allow Anyway, or run sudo xattr -r -d com.apple.quarantine $PWD/cbsh inside your terminal. The next time you run the binary you’ll get another prompt, but after that it should run fine.

1.2. Connecting to a Database
The first time that you run ./cbsh you will receive a prompt asking if you’d like to create a config file. If you choose yes, the shell will walk you through a series of prompts to collect information about your default database. If you choose no, it will try to connect to localhost using the Administrator username and the password password. You can modify this through CLI arguments (see ./cbsh -h for more information).
Note: Unless you specify TLS settings, PLAIN authentication is used and your credentials are sent in plaintext.
❯ ./cbsh --username Administrator --connstr 127.0.0.1 -p
Password:
[WARN] 2023-04-14 08:32:25.180 Using PLAIN authentication for cluster dev.local, credentials will sent in plaintext - configure tls to disable this warning
[INFO] 2023-04-14 08:32:25.389 Thanks for trying CBSH!
👤 Administrator 🏠 dev.local in 🗄 default
>
Once in the shell, you can start to execute commands (see the introduction section for more information). As a quick sanity check, list the nodes in the database:
> nodes
╭───┬───────────┬────────────────┬─────────┬──────────────────────────┬───────────────────────┬───────────────────────────┬──────────────┬─────────────┬─────────╮
│ # │ cluster   │ hostname       │ status  │ services                 │ version               │ os                        │ memory_total │ memory_free │ capella │
├───┼───────────┼────────────────┼─────────┼──────────────────────────┼───────────────────────┼───────────────────────────┼──────────────┼─────────────┼─────────┤
│ 0 │ dev.local │ 127.0.0.1:8091 │ healthy │ search,indexing,kv,query │ 8.0.0-1246-enterprise │ x86_64-apple-darwin19.6.0 │ 34359738368  │ 12026126336 │ false   │
╰───┴───────────┴────────────────┴─────────┴──────────────────────────┴───────────────────────┴───────────────────────────┴──────────────┴─────────────┴─────────╯
Or if you have the travel-sample bucket installed you can switch to it and then fetch a document:
> doc get airline_10 --bucket travel-sample | flatten
───┬────────────┬─────────────────────┬────────────┬─────────┬─────────────┬──────┬──────┬──────────┬───────────────┬───────┬──────────
 # │ id         │ cas                 │ content_id │ type    │ name        │ iata │ icao │ callsign │ country       │ error │ database
───┼────────────┼─────────────────────┼────────────┼─────────┼─────────────┼──────┼──────┼──────────┼───────────────┼───────┼──────────
 0 │ airline_10 │ 1629809626107281408 │ 10         │ airline │ 40-Mile Air │ Q5   │ MLA  │ MILE-AIR │ United States │       │ default
───┴────────────┴─────────────────────┴────────────┴─────────┴─────────────┴──────┴──────┴──────────┴───────────────┴───────┴──────────
1.3. The config dotfiles
Connecting to a single database through the command line is nice when you are starting out, but later on you will likely either connect to the same database all the time or to a multitude of them. To help with this, you can create a .cbsh dot folder in your home directory and place a config file in it that the shell will read on startup.
The downloaded zip contains an example already, but here is a small sample config to help you get started as well:
version = 1
[[database]]
identifier = "local"
connstr = "127.0.0.1"
default-bucket = "travel-sample"
username = "Administrator"
password = "password"
[[database]]
identifier = "remote"
connstr = "10.143.200.101"
default-bucket = "myapp"
username = "user"
password = "pass"
This will register two databases, one called local and one called remote. In case you were wondering, the file format is TOML. Now when you start the shell, it will connect to local automatically and you are all set. Please check out the reference section for additional parameters you can set, as well as how to move the credentials to a separate credentials file in case you want to share your config with other people who do not use the same credentials.
2. Introduction
Couchbase Shell is fully featured: it not only contains commands related to Couchbase but is actually built on top of a general purpose shell called nushell. This allows you to interact with the file system or any other command available on your machine, making it a great tool for both operational and development tasks on top of Couchbase.
The following introduction only touches on the basic concepts to make you productive quickly. We recommend also checking out the great nushell documentation so you can get the most out of it.
2.1. Navigating the Shell
Commands take inputs and produce output in a structured manner, most often represented as tables. Note how both the generic ls command and the Couchbase-specific buckets command produce a table as their output:
> ls
────┬──────────────┬──────┬──────────┬────────────────
 #  │ name         │ type │ size     │ modified
────┼──────────────┼──────┼──────────┼────────────────
 0  │ CHANGELOG.md │ File │ 4.8 KB   │ 2 hours ago
 1  │ Cargo.lock   │ File │ 170.2 KB │ 16 minutes ago
 2  │ Cargo.toml   │ File │ 1.8 KB   │ 16 minutes ago
 3  │ LICENSE      │ File │ 11.4 KB  │ 2 days ago
 4  │ README.md    │ File │ 8.6 KB   │ 9 minutes ago
 5  │ docs         │ Dir  │ 544 B    │ 2 days ago
 6  │ examples     │ Dir  │ 192 B    │ 2 days ago
 7  │ jupyter      │ Dir  │ 128 B    │ 2 days ago
 8  │ src          │ Dir  │ 256 B    │ 2 days ago
 9  │ target       │ Dir  │ 224 B    │ 32 minutes ago
 10 │ tests        │ Dir  │ 224 B    │ 2 days ago
────┴──────────────┴──────┴──────────┴────────────────
> buckets
───┬──────────┬───────────────┬───────────┬──────────┬──────────────────────┬───────────┬───────────────┬────────┬──────
 # │ database │ name          │ type      │ replicas │ min_durability_level │ ram_quota │ flush_enabled │ status │ cloud
───┼──────────┼───────────────┼───────────┼──────────┼──────────────────────┼───────────┼───────────────┼────────┼──────
 0 │ default  │ beer-sample   │ couchbase │ 1        │ none                 │ 209.7 MB  │ false         │        │ false
 1 │ default  │ default       │ couchbase │ 1        │ none                 │ 104.9 MB  │ true          │        │ false
 2 │ default  │ targetBucket  │ couchbase │ 0        │ none                 │ 104.9 MB  │ true          │        │ false
 3 │ default  │ travel-sample │ couchbase │ 1        │ none                 │ 209.7 MB  │ false         │        │ false
───┴──────────┴───────────────┴───────────┴──────────┴──────────────────────┴───────────┴───────────────┴────────┴──────
You can pipe the output into other commands. For example, if you only want to see buckets that have sample in their name, you can utilize the where command:
> buckets | where name =~ "sample"
───┬──────────┬───────────────┬───────────┬──────────┬──────────────────────┬───────────┬───────────────┬────────┬──────
 # │ database │ name          │ type      │ replicas │ min_durability_level │ ram_quota │ flush_enabled │ status │ cloud
───┼──────────┼───────────────┼───────────┼──────────┼──────────────────────┼───────────┼───────────────┼────────┼──────
 0 │ default  │ beer-sample   │ couchbase │ 1        │ none                 │ 209.7 MB  │ false         │        │ false
 1 │ default  │ travel-sample │ couchbase │ 1        │ none                 │ 209.7 MB  │ false         │        │ false
───┴──────────┴───────────────┴───────────┴──────────┴──────────────────────┴───────────┴───────────────┴────────┴──────
In a similar fashion you can turn this structured table into other output formats, for example JSON:
> buckets | where name =~ "sample" | to json --pretty 2
[
{
"database": "default",
"name": "beer-sample",
"type": "couchbase",
"replicas": 1,
"min_durability_level": "none",
"ram_quota": 209715200,
"flush_enabled": false,
"status": "",
"cloud": false
},
{
"database": "default",
"name": "travel-sample",
"type": "couchbase",
"replicas": 1,
"min_durability_level": "none",
"ram_quota": 209715200,
"flush_enabled": false,
"status": "",
"cloud": false
}
]
Exactly this type of composition takes the Unix philosophy of "do one thing well" and meshes it with the idea of flexible, structured pipelines. This allows you to build powerful compositions that help you in your daily operations with Couchbase, from both a developer and an operations point of view.
2.2. Getting Help
Other than using this documentation for help, each command can be called with -h or --help to get information about potential flags, arguments and subcommands. Also, some commands provide additional examples.
> buckets -h
Perform bucket management operations
Usage:
> buckets <subcommand> {flags}
Subcommands:
buckets config - Shows the bucket config (low level)
buckets create - Creates a bucket
buckets drop - Drops buckets through the HTTP API
buckets flush - Flushes buckets through the HTTP API
buckets get - Fetches buckets through the HTTP API
buckets load-sample - Load a sample bucket
buckets update - Updates a bucket
Flags:
-h, --help: Display this help message
--databases <string>: the databases which should be contacted
Some commands (like the one above) only act as groupings for subcommands, like from, to or doc. Since they do not serve a purpose on their own, they render their help output automatically:
> doc
Perform document operations against a bucket or collection
Usage:
> doc <subcommand> {flags}
Subcommands:
doc get - Fetches a document through the data service
doc insert - Insert a document through the data service
doc remove - Removes a document through the data service
doc replace - Replace a document through the data service
doc upsert - Upsert (insert or override) a document through the data service
Flags:
-h, --help: Display this help message
2.3. The Prompt explained
Couchbase Shell uses a custom, two-line prompt to show you exactly which environment you are working in right now. Since you can connect to different databases, switch buckets, etc., it is important to know what is currently "active". Here is a sample prompt that will greet you when starting the shell:
👤 Administrator at 🏠 local in 🗄 travel-sample
>
It tells you that your user is Administrator, the currently active database identifier is local, and the active bucket is travel-sample.
If you have an active scope or collection set then the prompt will also update to reflect that:
👤 Administrator 🏠 dev.local in 🗄 travel-sample.myscope.mycollection
>
In the second line, your actual user prompt starts.
2.4. Loading Data into the Shell
If you want to import data into Couchbase, or just load it into the shell for further processing, there are different commands available to help you. Once the data is loaded into the shell, it can be sent to one of the Couchbase save commands such as doc upsert and doc import. Depending on the structure of the data, and the command used, you may also need to tweak it a little so it can be properly stored.
2.4.1. Doc import
The doc import command is the simplest way to import data through Couchbase Shell. The command expects the path to a file containing data; the data can be in any of the formats supported by the from command. In the following example we import a JSON file containing a single document. We also specify the --id-column flag because doc import will try to use an id field for the document key by default.
> cat user.json
{
"name": "Michael",
"age": 32,
"height": 180
}
> doc import user.json --id-column name
╭───┬───────────┬─────────┬────────┬──────────┬───────────╮
│ # │ processed │ success │ failed │ failures │ cluster   │
├───┼───────────┼─────────┼────────┼──────────┼───────────┤
│ 0 │ 1         │ 1       │ 0      │          │ dev.local │
╰───┴───────────┴─────────┴────────┴──────────┴───────────╯
> doc import -h
Import documents from a file through the data service
Usage:
> doc import {flags} <filename>
Flags:
-h, --help - Display the help message for this command
--id-column <String> - the name of the id column if used with an input stream
--bucket <String> - the name of the bucket
--expiry <Number> - the expiry for the documents in seconds, or absolute
--scope <String> - the name of the scope
--collection <String> - the name of the collection
--clusters <String> - the clusters which should be contacted
--batch-size <Number> - the maximum number of items to batch send at a time
Parameters:
filename <string>: the path to the file containing data to import
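The --batch-size flag controls how many items are sent per request rather than streaming them one by one. Conceptually, the input is split into fixed-size chunks; the sketch below is plain illustrative Python, not the shell's actual implementation:

```python
def batches(items, batch_size):
    """Yield successive chunks of at most batch_size items."""
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]

# Hypothetical document ids, purely for illustration.
doc_ids = [f"user_{i}" for i in range(7)]

chunks = list(batches(doc_ids, 3))
# 7 documents with a batch size of 3 -> chunks of 3, 3 and 1.
print([len(c) for c in chunks])
```

Larger batches mean fewer round trips but bigger requests; tuning the value against your document sizes is usually worthwhile.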
2.4.2. Manual import
The open command will look at the file ending and try to decode the file automatically. Imagine a file named user.json in your current directory.
> cat user.json
{
"name": "Michael",
"age": 32,
"height": 180
}
> open user.json
╭────────┬─────────╮
│ name   │ Michael │
│ age    │ 32      │
│ height │ 180     │
╰────────┴─────────╯
As you can see, the open command automatically decoded the JSON document into tabular format. If the file were named just user, the import would look like this instead:
> open user
{
"name": "Michael",
"age": 32,
"height": 180
}
If you are dealing with data that cannot be decoded automatically, you can use the various from subcommands to help with decoding. In our case we use from json:
> open user | from json
╭────────┬─────────╮
│ name   │ Michael │
│ age    │ 32      │
│ height │ 180     │
╰────────┴─────────╯
Tip: Take a look at the many different import formats from supports, including CSV, XML, YAML and even SQLite. With this simple tool at hand you are able to load many different data formats quickly and import them into Couchbase!
We cannot use this format directly with commands like doc upsert, as the command expects two "columns" in the data: id and content. This means that we have to perform some translation from the above format to one that doc upsert understands. To do this we wrap the entire document into a content column and then extract the id that we want to use:
> open user.json | wrap content | insert id {|it| echo $it.content.name}
╭─────────┬───────────────────╮
│ content │ {record 3 fields} │
│ id      │ Michael           │
╰─────────┴───────────────────╯
There are many other approaches to achieving this same result. With our data in the correct format we can then upsert:
> open user.json | wrap content | insert id {|it| echo $it.content.name} | doc upsert
╭───┬───────────┬─────────┬────────┬──────────┬───────────╮
│ # │ processed │ success │ failed │ failures │ cluster   │
├───┼───────────┼─────────┼────────┼──────────┼───────────┤
│ 0 │ 1         │ 1       │ 0      │          │ dev.local │
╰───┴───────────┴─────────┴────────┴──────────┴───────────╯
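The shape that doc upsert expects can also be pictured outside the shell. This small sketch (plain Python, purely illustrative) performs the same wrap-and-extract translation on the example document:

```python
user = {"name": "Michael", "age": 32, "height": 180}

# `wrap content` nests the whole record under a content column;
# `insert id ...` then copies the name field up to use as the document key.
row = {"content": user, "id": user["name"]}

print(row["id"])
```

The upsert command then stores row["content"] as the document body under the key row["id"].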
See the Importing data recipes for more information.
2.5. Exporting Data from the Shell
The export counterparts to open and from are save and to. You can use both commands to take tabular data from the shell and store it in files of the needed target format. Like open, save will try to discern the format from the file ending.
The following example will load a JSON file and then save it as CSV:
> cat user.json
{
"name": "Michael",
"age": 32,
"height": 180
}
> open user.json | save user.csv
> cat user.csv
name,age,height
Michael,32,180
This example deals with only one row for simplicity, but you can save as many rows as you need in one file. As a motivating example, the following snippet runs a N1QL query and stores the result as a CSV file:
> query "select airportname,city,country from `travel-sample` where type = 'airport' limit 10" | save output.csv
> cat output.csv
airportname,city,country
Calais Dunkerque,Calais,France
Peronne St Quentin,Peronne,France
Les Loges,Nangis,France
Couterne,Bagnole-de-l'orne,France
Bray,Albert,France
Le Touquet Paris Plage,Le Tourquet,France
Denain,Valenciennes,France
Glisy,Amiens,France
La Garenne,Agen,France
Cazaux,Cazaux,France
See the Exporting data recipes for more information.
3. cb-env and the Environment
Whilst multiple databases can be registered at the same time, at most one database is ever active. The same is true for buckets, scopes, and collections. When a resource is active, it is used as the default to run commands against (this can be overridden on a per-command basis).
You can run the cb-env command, which will tell you which resources are currently active (you are also able to tell from the prompt):
> cb-env
╭──────────────┬─────────────╮
│ username     │ charlie     │
│ display_name │ Charlie     │
│ database     │ capella     │
│ bucket       │ default     │
│ scope        │ inventory   │
│ collection   │ hotel       │
│ cluster_type │ provisioned │
╰──────────────┴─────────────╯
If you were to now run a command, it would be run:
- As the user "charlie"
- Against the "capella" database
- Against the "default" bucket
- Against the "inventory" scope
- Against the "hotel" collection
Note that display_name is the name that appears in your shell prompt and is not used by commands. You can also change the active resources with the cb-env command.
> cb-env -h
Modify the default execution environment of commands
Usage:
> cb-env {flags}
Subcommands:
cb-env bucket - Sets the active bucket based on its name
cb-env capella-organization - Sets the active capella organization based on its identifier
cb-env collection - Sets the active collection based on its name
cb-env database - Sets the active database based on its identifier
cb-env managed - Lists all databases currently managed by couchbase shell
cb-env project - Sets the active project based on its name
cb-env register - Registers a database for use with the shell
cb-env scope - Sets the active scope based on its name
cb-env timeouts - Sets the active timeouts for operations
cb-env unregister - Unregisters a database from use with the shell
Flags:
-h, --help - Display the help message for this command
--capella - show default execution environment of capella
--timeouts - show default execution environment for timeouts
For example if you change the active bucket:
> cb-env bucket beer-sample
╭────────┬─────────────╮
│ bucket │ beer-sample │
╰────────┴─────────────╯
> cb-env
╭──────────────┬─────────────╮
│ username     │ charlie     │
│ display_name │ Charlie     │
│ database     │ capella     │
│ bucket       │ beer-sample │
│ scope        │ inventory   │
│ collection   │ hotel       │
│ cluster_type │ provisioned │
╰──────────────┴─────────────╯
Both the output of cb-env and the prompt will reflect the changes.
3.1. Per command execution environments
On many commands you will notice a set of flags which allow you to override the active execution environment. Different commands support different flags; depending on the command you can expect to see any of:
- --databases
- --bucket
- --scope
- --collection
3.1.1. The --databases flag
The argument for this flag is an identifier combined with a regular expression. So imagine you have four databases set up with the following names:
> cb-env | get database
───┬────────────────
 0 │ prod-us-west
 1 │ prod-us-east
 2 │ prod-eu-center
 3 │ local-test
───┴────────────────
If you wanted to run a command against all of the prod-us databases, you could use --databases prod-us.*, e.g.:
> buckets --databases prod-us.*
───┬──────────────┬───────────────┬───────────┬──────────┬──────────────────────┬───────────┬───────────────┬────────┬────────
 # │ database     │ name          │ type      │ replicas │ min_durability_level │ ram_quota │ flush_enabled │ status │ capella
───┼──────────────┼───────────────┼───────────┼──────────┼──────────────────────┼───────────┼───────────────┼────────┼────────
 0 │ prod-us-east │ default       │ couchbase │ 1        │ none                 │ 268.4 MB  │ false         │        │ false
 1 │ prod-us-west │ default       │ couchbase │ 1        │ none                 │ 268.4 MB  │ false         │        │ false
 2 │ prod-us-west │ travel-sample │ couchbase │ 1        │ none                 │ 209.7 MB  │ false         │        │ false
───┴──────────────┴───────────────┴───────────┴──────────┴──────────────────────┴───────────┴───────────────┴────────┴────────
In the background this gets passed to a regex engine, so you can go a little crazy with it if needed.
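Conceptually, the flag behaves like regular-expression matching over the registered identifiers. The sketch below illustrates the idea with Python's re module (whether cbsh anchors the pattern is an implementation detail; this sketch assumes a full match over each name):

```python
import re

# The four database identifiers from the example above.
registered = ["prod-us-west", "prod-us-east", "prod-eu-center", "local-test"]

def select(pattern, names):
    """Keep the names the pattern matches, as --databases does conceptually."""
    return [n for n in names if re.fullmatch(pattern, n)]

print(select("prod-us.*", registered))
# A more elaborate pattern selects both regions but still skips local-test:
print(select("prod-(us|eu).*", registered))
```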
3.1.2. The --bucket, --scope, and --collection flags
These flags are a little different to the --databases flag: they are not regular expressions and can only be used to define a single name each. Unlike --databases, the name provided to these flags does not have to be already known to Couchbase Shell; they can refer to any bucket, scope, and collection that exists within your active database or defined database(s).
For example:
> doc get 1 --bucket travel-sample --scope tenant_agent_00 --collection users
───┬────┬─────────────────────┬──────────────────┬───────┬──────────────
 # │ id │ cas                 │ content          │ error │ database
───┼────┼─────────────────────┼──────────────────┼───────┼──────────────
 0 │ 1  │ 1638870288919035904 │ [row 11 columns] │       │ prod-us-west
───┴────┴─────────────────────┴──────────────────┴───────┴──────────────
4. Couchbase Commands
The following sections discuss the individual Couchbase-specific commands in greater detail. Remember, you can always mix and match them with other built-in shell commands as well as executables from your environment.
4.1. Working with databases
The cb-env managed command lists all the databases you have registered with the shell.
> cb-env managed
╭───┬────────┬───────┬────────────┬───────────────┬──────────────────────╮
│ # │ active │ tls   │ identifier │ username      │ capella_organization │
├───┼────────┼───────┼────────────┼───────────────┼──────────────────────┤
│ 0 │ true   │ false │ dev.local  │ Administrator │                      │
│ 1 │ false  │ true  │ capella    │ charlie       │                      │
╰───┴────────┴───────┴────────────┴───────────────┴──────────────────────╯
4.2. Working with buckets
The buckets command lists all the buckets from your active database:
> buckets
╭───┬───────────┬─────────────┬───────────┬──────────┬──────────────────────┬───────────┬───────────────┬────────┬───────╮
│ # │ cluster   │ name        │ type      │ replicas │ min_durability_level │ ram_quota │ flush_enabled │ status │ cloud │
├───┼───────────┼─────────────┼───────────┼──────────┼──────────────────────┼───────────┼───────────────┼────────┼───────┤
│ 0 │ dev.local │ beer-sample │ couchbase │ 2        │ none                 │ 412.0 MiB │ false         │        │ false │
│ 1 │ dev.local │ default     │ couchbase │ 0        │ none                 │ 512.0 MiB │ false         │        │ false │
│ 2 │ dev.local │ memd        │ memcached │ 0        │ none                 │ 100.0 MiB │ false         │        │ false │
╰───┴───────────┴─────────────┴───────────┴──────────┴──────────────────────┴───────────┴───────────────┴────────┴───────╯
As an advanced command, it is also possible to get the configuration for a bucket:
> buckets config default
╭────────────────────────┬──────────────────────────────────────────────────────────────────────────────────────╮
│ authType               │ sasl                                                                                 │
│ autoCompactionSettings │ false                                                                                │
│ basicStats             │ {record 8 fields}                                                                    │
│ bucketCapabilities     │ [list 17 items]                                                                      │
│ bucketCapabilitiesVer  │                                                                                      │
│ bucketType             │ membase                                                                              │
│ collectionsManifestUid │ 21                                                                                   │
│ compressionMode        │ passive                                                                              │
│ conflictResolutionType │ seqno                                                                                │
│ controllers            │ {record 4 fields}                                                                    │
│ ddocs                  │ {record 1 field}                                                                     │
│ durabilityMinLevel     │ none                                                                                 │
│ evictionPolicy         │ valueOnly                                                                            │
│ localRandomKeyUri      │ /pools/default/buckets/default/localRandomKey                                        │
│ maxTTL                 │ 0                                                                                    │
│ name                   │ default                                                                              │
│ nodeLocator            │ vbucket                                                                              │
│ nodes                  │ [table 1 row]                                                                        │
│ numVBuckets            │ 64                                                                                   │
│ pitrEnabled            │ false                                                                                │
│ pitrGranularity        │ 600                                                                                  │
│ pitrMaxHistoryAge      │ 86400                                                                                │
│ quota                  │ {record 2 fields}                                                                    │
│ replicaIndex           │ false                                                                                │
│ replicaNumber          │ 0                                                                                    │
│ stats                  │ {record 3 fields}                                                                    │
│ storageBackend         │ couchstore                                                                           │
│ streamingUri           │ /pools/default/bucketsStreaming/default?bucket_uuid=0ef162c33e14b163630f04639b347937 │
│ threadsNumber          │ 3                                                                                    │
│ uri                    │ /pools/default/buckets/default?bucket_uuid=0ef162c33e14b163630f04639b347937          │
│ uuid                   │ 0ef162c33e14b163630f04639b347937                                                     │
│ vBucketServerMap       │ {record 4 fields}                                                                    │
╰────────────────────────┴──────────────────────────────────────────────────────────────────────────────────────╯
If you are unsure what you would use this for, you probably don’t need it.
4.3. Working with scopes and collections
The scopes and collections commands can be used for managing scopes and collections respectively.
4.3.1. Scopes
> scopes -h
Fetches scopes through the HTTP API
Usage:
> scopes <subcommand> {flags}
Subcommands:
scopes create - Creates scopes through the HTTP API
scopes drop - Deletes scopes through the HTTP API
Flags:
-h, --help: Display this help message
--bucket <string>: the name of the bucket
--databases <string>: the databases to query against
To list all scopes in the bucket you would use:
> scopes
╭───┬─────────────────┬───────────╮
│ # │ scope           │ cluster   │
├───┼─────────────────┼───────────┤
│ 0 │ inventory       │ dev.local │
│ 1 │ tenant_agent_00 │ dev.local │
│ 2 │ tenant_agent_01 │ dev.local │
│ 3 │ tenant_agent_02 │ dev.local │
│ 4 │ tenant_agent_03 │ dev.local │
│ 5 │ tenant_agent_04 │ dev.local │
│ 6 │ _default        │ dev.local │
╰───┴─────────────────┴───────────╯
You can also create and remove scopes:
> scopes create tenant_agent_05
> scopes
╭───┬─────────────────┬───────────╮
│ # │ scope           │ cluster   │
├───┼─────────────────┼───────────┤
│ 0 │ tenant_agent_05 │ dev.local │
│ 1 │ inventory       │ dev.local │
│ 2 │ tenant_agent_00 │ dev.local │
│ 3 │ tenant_agent_01 │ dev.local │
│ 4 │ tenant_agent_02 │ dev.local │
│ 5 │ tenant_agent_03 │ dev.local │
│ 6 │ tenant_agent_04 │ dev.local │
│ 7 │ _default        │ dev.local │
╰───┴─────────────────┴───────────╯
> scopes drop tenant_agent_05
> scopes
╭───┬─────────────────┬───────────╮
│ # │ scope           │ cluster   │
├───┼─────────────────┼───────────┤
│ 0 │ inventory       │ dev.local │
│ 1 │ tenant_agent_00 │ dev.local │
│ 2 │ tenant_agent_01 │ dev.local │
│ 3 │ tenant_agent_02 │ dev.local │
│ 4 │ tenant_agent_03 │ dev.local │
│ 5 │ tenant_agent_04 │ dev.local │
│ 6 │ _default        │ dev.local │
╰───┴─────────────────┴───────────╯
4.3.2. Collections
> collections -h
Fetches collections through the HTTP API
Usage:
> collections <subcommand> {flags}
Subcommands:
collections create - Creates collections through the HTTP API
collections drop - Deletes collections through the HTTP API
Flags:
-h, --help: Display this help message
--bucket <string>: the name of the bucket
--scope <string>: the name of the scope
--databases <string>: the databases to query against
To list all collections in the bucket you would use:
> collections
╭────┬─────────────────┬────────────┬────────────┬───────────╮
│ #  │ scope           │ collection │ max_expiry │ cluster   │
├────┼─────────────────┼────────────┼────────────┼───────────┤
│ 0  │ inventory       │ landmark   │ 0sec       │ dev.local │
│ 1  │ inventory       │ hotel      │ 0sec       │ dev.local │
│ 2  │ inventory       │ airport    │ 0sec       │ dev.local │
│ 3  │ inventory       │ airline    │ 0sec       │ dev.local │
│ 4  │ inventory       │ route      │ 0sec       │ dev.local │
│ 5  │ tenant_agent_00 │ bookings   │ 0sec       │ dev.local │
│ 6  │ tenant_agent_00 │ users      │ 0sec       │ dev.local │
│ 7  │ tenant_agent_01 │ users      │ 0sec       │ dev.local │
│ 8  │ tenant_agent_01 │ bookings   │ 0sec       │ dev.local │
│ 9  │ tenant_agent_02 │ users      │ 0sec       │ dev.local │
│ 10 │ tenant_agent_02 │ bookings   │ 0sec       │ dev.local │
│ 11 │ tenant_agent_03 │ users      │ 0sec       │ dev.local │
│ 12 │ tenant_agent_03 │ bookings   │ 0sec       │ dev.local │
│ 13 │ tenant_agent_04 │ users      │ 0sec       │ dev.local │
│ 14 │ tenant_agent_04 │ bookings   │ 0sec       │ dev.local │
│ 15 │ _default        │ _default   │ 0sec       │ dev.local │
╰────┴─────────────────┴────────────┴────────────┴───────────╯
You can also create and remove collections:
> collections create staff --scope tenant_agent_00
> collections --scope tenant_agent_00
╭───┬─────────────────┬────────────┬────────────┬───────────╮
│ # │ scope           │ collection │ max_expiry │ cluster   │
├───┼─────────────────┼────────────┼────────────┼───────────┤
│ 0 │ tenant_agent_00 │ staff      │ 0sec       │ dev.local │
│ 1 │ tenant_agent_00 │ bookings   │ 0sec       │ dev.local │
│ 2 │ tenant_agent_00 │ users      │ 0sec       │ dev.local │
╰───┴─────────────────┴────────────┴────────────┴───────────╯
> collections drop staff --scope tenant_agent_00
> collections --scope tenant_agent_00
╭───┬─────────────────┬────────────┬────────────┬───────────╮
│ # │ scope           │ collection │ max_expiry │ cluster   │
├───┼─────────────────┼────────────┼────────────┼───────────┤
│ 0 │ tenant_agent_00 │ bookings   │ 0sec       │ dev.local │
│ 1 │ tenant_agent_00 │ users      │ 0sec       │ dev.local │
╰───┴─────────────────┴────────────┴────────────┴───────────╯
4.4. Listing nodes
The nodes command allows you to list all the nodes of the database you are currently connected to.
> nodes
───┬──────────┬─────────────────────┬─────────┬───────────────────┬───────────────────────┬──────────────────────────┬──────────────┬────────────
 # │ database │ hostname            │ status  │ services          │ version               │ os                       │ memory_total │ memory_free
───┼──────────┼─────────────────────┼─────────┼───────────────────┼───────────────────────┼──────────────────────────┼──────────────┼────────────
 0 │ remote   │ 10.143.200.101:8091 │ healthy │ indexing,kv,query │ 6.5.0-4960-enterprise │ x86_64-unknown-linux-gnu │ 2.1 GB       │ 837.7 MB
 1 │ remote   │ 10.143.200.102:8091 │ healthy │ indexing,kv,query │ 6.5.0-4960-enterprise │ x86_64-unknown-linux-gnu │ 2.1 GB       │ 1.0 GB
───┴──────────┴─────────────────────┴─────────┴───────────────────┴───────────────────────┴──────────────────────────┴──────────────┴────────────
4.5. Reading and Writing `doc`uments
The fastest way to interact with documents is through the key value service (as long as you know the document ID). All of these commands are located as subcommands under the doc namespace.
4.5.1. Reading
You can retrieve a document with doc get:
> doc get airline_10
╭───┬────────────┬───────────────────┬─────────────────────┬───────┬───────────╮
│ # │ id         │ content           │ cas                 │ error │ cluster   │
├───┼────────────┼───────────────────┼─────────────────────┼───────┼───────────┤
│ 0 │ airline_10 │ {record 7 fields} │ 1681456999724089344 │       │ dev.local │
╰───┴────────────┴───────────────────┴─────────────────────┴───────┴───────────╯
To distinguish the actual content from the metadata, the content is nested in the content field. If you want to have everything at the top level, you can pipe to the flatten command:
> doc get airline_10 | flatten
╭───┬────────────┬──────────┬───────────────┬──────┬──────┬────────────┬─────────────┬─────────┬─────────────────────┬───────┬───────────╮
│ # │ id         │ callsign │ country       │ iata │ icao │ content_id │ name        │ type    │ cas                 │ error │ cluster   │
├───┼────────────┼──────────┼───────────────┼──────┼──────┼────────────┼─────────────┼─────────┼─────────────────────┼───────┼───────────┤
│ 0 │ airline_10 │ MILE-AIR │ United States │ Q5   │ MLA  │ 10         │ 40-Mile Air │ airline │ 1681456999724089344 │       │ dev.local │
╰───┴────────────┴──────────┴───────────────┴──────┴──────┴────────────┴─────────────┴─────────┴─────────────────────┴───────┴───────────╯
If the document is not found, an empty result is returned.
To perform a bulk get operation, pass a stream of document IDs through the pipeline:
> echo [airline_10 airline_10748 airline_137] | wrap id | doc get
╭───┬───────────────┬───────────────────┬─────────────────────┬───────┬───────────╮
│ # │ id            │ content           │ cas                 │ error │ cluster   │
├───┼───────────────┼───────────────────┼─────────────────────┼───────┼───────────┤
│ 0 │ airline_10    │ {record 7 fields} │ 1681456999724089344 │       │ dev.local │
│ 1 │ airline_10748 │ {record 7 fields} │ 1681456996753211392 │       │ dev.local │
│ 2 │ airline_137   │ {record 7 fields} │ 1681457004278579200 │       │ dev.local │
╰───┴───────────────┴───────────────────┴─────────────────────┴───────┴───────────╯
If `doc get` operates on an incoming stream, it will extract the document ID from the `id` column.
This behavior can be customized through the `--id-column` flag.
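For example, if the IDs live under a different column name, you can point `doc get` at that column. The column name below is made up for illustration:

> echo [airline_10 airline_137] | wrap airline | doc get --id-column airline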
4.5.2. Mutating
Documents can be mutated with `doc insert`, `doc upsert` and `doc replace`.
All three commands take similar arguments. If you only want to mutate a single document, passing in the ID and the content as arguments is the simplest way:
> doc upsert my-doc {"hello": "world"}
╭───┬───────────┬─────────┬────────┬──────────┬───────────╮
│ # │ processed │ success │ failed │ failures │ cluster   │
├───┼───────────┼─────────┼────────┼──────────┼───────────┤
│ 0 │ 1         │ 1       │ 0      │          │ dev.local │
╰───┴───────────┴─────────┴────────┴──────────┴───────────╯
Multiple documents can be mutated through an input stream as well, defaulting to the `id` and `content` columns.
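As a sketch (using hypothetical IDs and bodies), a bulk upsert from a literal table looks like this:

> echo [[id content]; [my-doc-1 {greeting: "hello"}] [my-doc-2 {greeting: "world"}]] | doc upsert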
4.5.3. Removing
Documents can be removed with `doc remove`.
> doc remove airline_10
╭───┬───────────┬─────────┬────────┬──────────┬───────────╮
│ # │ processed │ success │ failed │ failures │ cluster   │
├───┼───────────┼─────────┼────────┼──────────┼───────────┤
│ 0 │ 1         │ 1       │ 0      │          │ dev.local │
╰───┴───────────┴─────────┴────────┴──────────┴───────────╯
Similar to `doc get`, if you want to delete more than one document at the same time, provide a stream of IDs with an `id` column:
> echo [airline_10 airline_10748 airline_137] | wrap id | doc remove
╭───┬───────────┬─────────┬────────┬───────────────┬───────────╮
│ # │ processed │ success │ failed │ failures      │ cluster   │
├───┼───────────┼─────────┼────────┼───────────────┼───────────┤
│ 0 │ 3         │ 2       │ 1      │ Key not found │ dev.local │
╰───┴───────────┴─────────┴────────┴───────────────┴───────────╯
4.6. version
The `version` command lists the version of Couchbase Shell.
> version
╭─────────┬────────╮
│ version │ 0.75.1 │
╰─────────┴────────╯
5. Reference
5.1. Config File Format
The `~/.cbsh/config` file with examples:
# Allows us to evolve in the future without breaking old config files
version = 1
[[database]]
identifier = "default"
connstr = "127.0.0.1"
default-bucket = "travel-sample"
default-scope = "my-scope"
default-collection = "my-collection"
# The following can be part of the config or credentials
username = "Administrator"
password = "password"
# TLS defaults to on
# tls-enabled = false
# tls-cert-path = "/path/to/cert" # either accept all certs or provide a cert path
# tls-accept-all-certs = true
# tls-validate-hostnames = false
# User display name is optional and is used to display a different name to the username in the prompt itself.
# This can be useful if the username that you are provided is a long randomly generated string or similar.
# user-display-name = "Charlie"
# Timeouts broadly apply to the operations that you would expect them to.
# That is:
# * data: commands using the kv service such as `doc`
# * query: `query` commands
# * analytics: `analytics` commands
# * search: `search` commands
# * management: commands that perform management level operations, such as `users`, `bucket`, `health` etc...
data-timeout = "10s"
query-timeout = "75s"
analytics-timeout = "75s"
search-timeout = "1m 15s"
management-timeout = "75s"
5.2. Credentials File Format
The optional `~/.cbsh/credentials` file with examples:
# Allows us to evolve in the future without breaking old config files
version = 1
[[database]]
identifier = "default"
username = "Administrator"
password = "password"
# TLS defaults to on, accepting all certs
# tls-enabled = true
# tls-cert-path = "/path/to/cert" # either accept all certs or provide a cert path
# tls-accept-all-certs = true
# tls-validate-hostnames = false
6. Recipes
6.1. Importing data
Couchbase Shell supports loading data from a variety of formats and sources.
The simplest way to import data is using `doc import`, as covered in Loading data into the shell.
These recipes cover more advanced use cases.
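For reference, a basic import of a single JSON file with `doc import` looks like the following (assuming a local mydoc.json as used in the examples below):

> doc import mydoc.json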
6.1.1. A Note on Data Format
The `doc upsert` command uses exactly two fields/columns for each upsert.
There can be more than two fields/columns in the data, but only two will be used.
By default these two columns are named `id` and `content`, but they can be overridden with `--id-column` and `--content-column`.
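For example, with data whose key lives in a column named `name` and whose body lives in a column named `doc` (hypothetical names and file), you could run:

> open mydata.json | doc upsert --id-column name --content-column doc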
Given the following document format, we need to perform some data manipulation to get it into a format which works with `doc upsert`:
> cat mydoc.json
{"id":3719,"cas":1600344369374167040,"type":"airport","airportname":"Columbia Rgnl","city":"Columbia","country":"United States","faa":"COU","icao":"KCOU","tz":"America/Chicago"}
> open mydoc.json | wrap content | insert id { |it| echo $it.content.airportname }
╭─────────┬───────────────────╮
│ content │ {record 9 fields} │
│ id      │ Columbia Rgnl     │
╰─────────┴───────────────────╯
6.1.2. From file
From JSON
Single Document
> cat mydoc.json
{"id":3719,"cas":1600344369374167040,"type":"airport","airportname":"Columbia Rgnl","city":"Columbia","country":"United States","faa":"COU","icao":"KCOU","tz":"America/Chicago"}
> open mydoc.json | wrap content | insert id { |it| echo $it.content.airportname } | doc upsert
╭───┬───────────┬─────────┬────────┬──────────┬───────────╮
│ # │ processed │ success │ failed │ failures │ cluster   │
├───┼───────────┼─────────┼────────┼──────────┼───────────┤
│ 0 │ 1         │ 1       │ 0      │          │ dev.local │
╰───┴───────────┴─────────┴────────┴──────────┴───────────╯
Multiple Documents
> ls airports
โโโโฌโโโโโโโโโโโโโโโโโโโโโโโโโโโโโฌโโโโโโโฌโโโโโโโโฌโโโโโโโโโโโโ
# โ name โ type โ size โ modified
โโโโผโโโโโโโโโโโโโโโโโโโโโโโโโโโโโผโโโโโโโผโโโโโโโโผโโโโโโโโโโโโ
0 โ airports/airport_3719.json โ File โ 151 B โ 2 days ago
1 โ airports/airport_3720.json โ File โ 155 B โ 2 days ago
2 โ airports/airport_3721.json โ File โ 172 B โ 2 days ago
3 โ airports/airport_3722.json โ File โ 161 B โ 2 days ago
4 โ airports/airport_3723.json โ File โ 163 B โ 2 days ago
5 โ airports/airport_3724.json โ File โ 156 B โ 2 days ago
6 โ airports/airport_3725.json โ File โ 148 B โ 2 days ago
7 โ airports/airport_3726.json โ File โ 164 B โ 2 days ago
8 โ airports/airport_3727.json โ File โ 169 B โ 2 days ago
9 โ airports/airport_3728.json โ File โ 152 B โ 2 days ago
โโโโดโโโโโโโโโโโโโโโโโโโโโโโโโโโโโดโโโโโโโดโโโโโโโโดโโโโโโโโโโโโ
> open airports/airport_3719.json
โโโโฌโโโโโโโฌโโโโโโโโโโฌโโโโโโโโโโโโโโฌโโโโโโโโโโโฌโโโโโโโโโโฌโโโโโโฌโโโโโโโฌโโโโโโโโโโโโโโ
# โ id โ type โ airportname โ city โ country โ faa โ icao โ tz
โโโโผโโโโโโโผโโโโโโโโโโผโโโโโโโโโโโโโโผโโโโโโโโโโโผโโโโโโโโโโผโโโโโโผโโโโโโโผโโโโโโโโโโโโโโ
0 โ 3719 โ airport โ Columbia โ Columbia โ United โ COU โ KCOU โ America/Chic
โ โ โ Rgnl โ โ States โ โ โ ago
โโโโดโโโโโโโดโโโโโโโโโโดโโโโโโโโโโโโโโดโโโโโโโโโโโดโโโโโโโโโโดโโโโโโดโโโโโโโดโโโโโโโโโโโโโโ
> ls airports/ | each { |it| open $it.name | wrap content | insert id { |doc| echo $doc.content.airportname} } | doc upsert
╭───┬───────────┬─────────┬────────┬──────────┬───────────╮
│ # │ processed │ success │ failed │ failures │ cluster   │
├───┼───────────┼─────────┼────────┼──────────┼───────────┤
│ 0 │ 10        │ 10      │ 0      │          │ dev.local │
╰───┴───────────┴─────────┴────────┴──────────┴───────────╯
From CSV
Single Document
> cat mydoc.csv
id,cas,type,airportname,city,country,faa,icao,tz
3719,1600344369374167040,airport,Columbia Rgnl,Columbia,United States,COU,KCOU,America/Chicago
> open mydoc.csv | each { |it| wrap content | insert id { |doc| echo $doc.content.airportname} } | doc upsert
╭───┬───────────┬─────────┬────────┬──────────┬───────────╮
│ # │ processed │ success │ failed │ failures │ cluster   │
├───┼───────────┼─────────┼────────┼──────────┼───────────┤
│ 0 │ 1         │ 1       │ 0      │          │ dev.local │
╰───┴───────────┴─────────┴────────┴──────────┴───────────╯
Multiple Documents
> cat airports.csv
airportname,city,country,faa,icao,id,type,tz
Calais Dunkerque,Calais,France,CQF,LFAC,1254,airport,Europe/Paris
Peronne St Quentin,Peronne,France,,LFAG,1255,airport,Europe/Paris
Les Loges,Nangis,France,,LFAI,1256,airport,Europe/Paris
Couterne,Bagnole-de-l'orne,France,,LFAO,1257,airport,Europe/Paris
Bray,Albert,France,,LFAQ,1258,airport,Europe/Paris
> open airports.csv | each { |it| wrap content | insert id { |doc| echo $doc.content.airportname } } | doc upsert
╭───┬───────────┬─────────┬────────┬──────────┬───────────╮
│ # │ processed │ success │ failed │ failures │ cluster   │
├───┼───────────┼─────────┼────────┼──────────┼───────────┤
│ 0 │ 5         │ 5       │ 0      │          │ dev.local │
╰───┴───────────┴─────────┴────────┴──────────┴───────────╯
6.1.3. Modifying data
In some circumstances you may want to modify the data before you import it.
Let’s take the example of importing from a csv file but this time the airports.csv file is missing the type
column but we want to add it to our data:
> cat airports.csv
airportname,city,country,faa,icao,id,tz
Calais Dunkerque,Calais,France,CQF,LFAC,1254,Europe/Paris
Peronne St Quentin,Peronne,France,,LFAG,1255,Europe/Paris
Les Loges,Nangis,France,,LFAI,1256,Europe/Paris
Couterne,Bagnole-de-l'orne,France,,LFAO,1257,Europe/Paris
Bray,Albert,France,,LFAQ,1258,Europe/Paris
> open airports.csv | insert type airport
╭───┬────────────────────┬───────────────────┬─────────┬─────┬──────┬──────┬──────────────┬─────────╮
│ # │ airportname        │ city              │ country │ faa │ icao │ id   │ tz           │ type    │
├───┼────────────────────┼───────────────────┼─────────┼─────┼──────┼──────┼──────────────┼─────────┤
│ 0 │ Calais Dunkerque   │ Calais            │ France  │ CQF │ LFAC │ 1254 │ Europe/Paris │ airport │
│ 1 │ Peronne St Quentin │ Peronne           │ France  │     │ LFAG │ 1255 │ Europe/Paris │ airport │
│ 2 │ Les Loges          │ Nangis            │ France  │     │ LFAI │ 1256 │ Europe/Paris │ airport │
│ 3 │ Couterne           │ Bagnole-de-l'orne │ France  │     │ LFAO │ 1257 │ Europe/Paris │ airport │
│ 4 │ Bray               │ Albert            │ France  │     │ LFAQ │ 1258 │ Europe/Paris │ airport │
╰───┴────────────────────┴───────────────────┴─────────┴─────┴──────┴──────┴──────────────┴─────────╯
We can also add a column based on data from other columns, for instance adding a `type` column which is set to the relevant city:
> open airports.csv | each { |it| insert type $it.city }
╭───┬────────────────────┬───────────────────┬─────────┬─────┬──────┬──────┬──────────────┬───────────────────╮
│ # │ airportname        │ city              │ country │ faa │ icao │ id   │ tz           │ type              │
├───┼────────────────────┼───────────────────┼─────────┼─────┼──────┼──────┼──────────────┼───────────────────┤
│ 0 │ Calais Dunkerque   │ Calais            │ France  │ CQF │ LFAC │ 1254 │ Europe/Paris │ Calais            │
│ 1 │ Peronne St Quentin │ Peronne           │ France  │     │ LFAG │ 1255 │ Europe/Paris │ Peronne           │
│ 2 │ Les Loges          │ Nangis            │ France  │     │ LFAI │ 1256 │ Europe/Paris │ Nangis            │
│ 3 │ Couterne           │ Bagnole-de-l'orne │ France  │     │ LFAO │ 1257 │ Europe/Paris │ Bagnole-de-l'orne │
│ 4 │ Bray               │ Albert            │ France  │     │ LFAQ │ 1258 │ Europe/Paris │ Albert            │
╰───┴────────────────────┴───────────────────┴─────────┴─────┴──────┴──────┴──────────────┴───────────────────╯
6.2. Exporting data
Couchbase Shell supports exporting data to a variety of formats and sources.
6.2.1. A Note on Data Format
The `doc get` command exposes data as three fields: `id`, `cas`, and `content`.
The body of the document is stored within the `content` column.
If you want to store only the document body, you can use `doc get <id> | get content`.
6.2.2. To file
To JSON
From KeyValue
> doc get airport_3719 --bucket travel-sample
╭───┬──────────────┬───────────────────┬─────────────────────┬───────┬───────────╮
│ # │ id           │ content           │ cas                 │ error │ cluster   │
├───┼──────────────┼───────────────────┼─────────────────────┼───────┼───────────┤
│ 0 │ airport_3719 │ {record 9 fields} │ 1681456998755270656 │       │ dev.local │
╰───┴──────────────┴───────────────────┴─────────────────────┴───────┴───────────╯
> doc get airport_3719 | get content | save mydoc.json
> cat mydoc.json
{"airportname":"Columbia Rgnl","city":"Columbia","country":"United States","faa":"COU","geo":{"alt":889.0,"lat":38.818094,"lon":-92.219631},"icao":"KCOU","id":3719,"type":"airport","tz":"America/Chicago"}
From Query/Analytics
To Single Document
> query "SELECT `travel-sample`.* from `travel-sample` WHERE `type`='airport' LIMIT 5"
╭───┬────────────────────┬───────────────────┬─────────┬─────┬───────────────────┬──────┬──────┬─────────┬──────────────┬───────────╮
│ # │ airportname        │ city              │ country │ faa │ geo               │ icao │ id   │ type    │ tz           │ database  │
├───┼────────────────────┼───────────────────┼─────────┼─────┼───────────────────┼──────┼──────┼─────────┼──────────────┼───────────┤
│ 0 │ Calais Dunkerque   │ Calais            │ France  │ CQF │ {record 3 fields} │ LFAC │ 1254 │ airport │ Europe/Paris │ dev.local │
│ 1 │ Peronne St Quentin │ Peronne           │ France  │     │ {record 3 fields} │ LFAG │ 1255 │ airport │ Europe/Paris │ dev.local │
│ 2 │ Les Loges          │ Nangis            │ France  │     │ {record 3 fields} │ LFAI │ 1256 │ airport │ Europe/Paris │ dev.local │
│ 3 │ Couterne           │ Bagnole-de-l'orne │ France  │     │ {record 3 fields} │ LFAO │ 1257 │ airport │ Europe/Paris │ dev.local │
│ 4 │ Bray               │ Albert            │ France  │     │ {record 3 fields} │ LFAQ │ 1258 │ airport │ Europe/Paris │ dev.local │
╰───┴────────────────────┴───────────────────┴─────────┴─────┴───────────────────┴──────┴──────┴─────────┴──────────────┴───────────╯
> query "SELECT `travel-sample`.* from `travel-sample` WHERE `type`='airport' LIMIT 5" | save airports.json
> cat airports.json
[{"airportname":"Calais Dunkerque","city":"Calais","country":"France","faa":"CQF","geo":{"alt":12,"lat":50.962097,"lon":1.9547640000000002},"icao":"LFAC","id":1254,"type":"airport","tz":"Europe/Paris"},{"airportname":"Peronne St Quentin","city":"Peronne","country":"France","faa":null,"geo":{"alt":295,"lat":49.868547,"lon":3.0295780000000003},"icao":"LFAG","id":1255,"type":"airport","tz":"Europe/Paris"},{"airportname":"Les Loges","city":"Nangis","country":"France","faa":null,"geo":{"alt":428,"lat":48.596219,"lon":3.0067860000000004},"icao":"LFAI","id":1256,"type":"airport","tz":"Europe/Paris"},{"airportname":"Couterne","city":"Bagnole-de-l'orne","country":"France","faa":null,"geo":{"alt":718,"lat":48.545836,"lon":-0.387444},"icao":"LFAO","id":1257,"type":"airport","tz":"Europe/Paris"},{"airportname":"Bray","city":"Albert","country":"France","faa":null,"geo":{"alt":364,"lat":49.971531,"lon":2.697661},"icao":"LFAQ","id":1258,"type":"airport","tz":"Europe/Paris"}]
To Multiple Documents
> query "SELECT `travel-sample`.* from `travel-sample` WHERE `type`='airport' LIMIT 5"
╭───┬────────────────────┬───────────────────┬─────────┬─────┬───────────────────┬──────┬──────┬─────────┬──────────────┬───────────╮
│ # │ airportname        │ city              │ country │ faa │ geo               │ icao │ id   │ type    │ tz           │ database  │
├───┼────────────────────┼───────────────────┼─────────┼─────┼───────────────────┼──────┼──────┼─────────┼──────────────┼───────────┤
│ 0 │ Calais Dunkerque   │ Calais            │ France  │ CQF │ {record 3 fields} │ LFAC │ 1254 │ airport │ Europe/Paris │ dev.local │
│ 1 │ Peronne St Quentin │ Peronne           │ France  │     │ {record 3 fields} │ LFAG │ 1255 │ airport │ Europe/Paris │ dev.local │
│ 2 │ Les Loges          │ Nangis            │ France  │     │ {record 3 fields} │ LFAI │ 1256 │ airport │ Europe/Paris │ dev.local │
│ 3 │ Couterne           │ Bagnole-de-l'orne │ France  │     │ {record 3 fields} │ LFAO │ 1257 │ airport │ Europe/Paris │ dev.local │
│ 4 │ Bray               │ Albert            │ France  │     │ {record 3 fields} │ LFAQ │ 1258 │ airport │ Europe/Paris │ dev.local │
╰───┴────────────────────┴───────────────────┴─────────┴─────┴───────────────────┴──────┴──────┴─────────┴──────────────┴───────────╯
> query "SELECT `travel-sample`.* FROM `travel-sample` WHERE `type`='airport' LIMIT 5" | each { |it| echo $it | save (echo (["airport_", $it.id ,".json"] | str join)) }
> ls airport*.json
╭───┬───────────────────┬──────┬───────┬──────────╮
│ # │ name              │ type │ size  │ modified │
├───┼───────────────────┼──────┼───────┼──────────┤
│ 0 │ airport_1254.json │ file │ 277 B │ now      │
│ 1 │ airport_1255.json │ file │ 280 B │ now      │
│ 2 │ airport_1256.json │ file │ 270 B │ now      │
│ 3 │ airport_1257.json │ file │ 281 B │ now      │
│ 4 │ airport_1258.json │ file │ 265 B │ now      │
╰───┴───────────────────┴──────┴───────┴──────────╯
To CSV
From KeyValue
> doc get airport_3719 --bucket travel-sample
╭───┬──────────────┬───────────────────┬─────────────────────┬───────┬───────────╮
│ # │ id           │ content           │ cas                 │ error │ cluster   │
├───┼──────────────┼───────────────────┼─────────────────────┼───────┼───────────┤
│ 0 │ airport_3719 │ {record 9 fields} │ 1681456998755270656 │       │ dev.local │
╰───┴──────────────┴───────────────────┴─────────────────────┴───────┴───────────╯
> doc get airport_3719 --bucket travel-sample | get content
╭───┬───────────────┬──────────┬───────────────┬─────┬───────────────────┬──────┬──────┬─────────┬─────────────────╮
│ # │ airportname   │ city     │ country       │ faa │ geo               │ icao │ id   │ type    │ tz              │
├───┼───────────────┼──────────┼───────────────┼─────┼───────────────────┼──────┼──────┼─────────┼─────────────────┤
│ 0 │ Columbia Rgnl │ Columbia │ United States │ COU │ {record 3 fields} │ KCOU │ 3719 │ airport │ America/Chicago │
╰───┴───────────────┴──────────┴───────────────┴─────┴───────────────────┴──────┴──────┴─────────┴─────────────────╯
The `geo` column in the above data contains a `record`, which means that the data is nested.
This means that we have to flatten out the `geo` column due to limitations of the CSV format.
If we try to convert the content as it is, then we will see:
> doc get airport_3719 --bucket travel-sample | get content | to csv
Error: nu::shell::cant_convert (link)

  × Can't convert to CSV.
   ╭─[entry #25:1:1]
 1 │ doc get airport_3719 --bucket travel-sample | get content | to csv
   · ────┬───
   ·     ╰── can't convert table<airportname: string, city: string, country: string, faa: string, geo: record<alt: float, lat: float, lon: float>, icao: string, id: int, type: string, tz: string> to CSV
   ╰────
To flatten out the data we can simply use the `flatten` command:
> doc get airport_3719 --bucket travel-sample | get content | flatten
╭───┬───────────────┬──────────┬───────────────┬─────┬──────────┬─────────┬──────────┬──────┬──────┬─────────┬─────────────────╮
│ # │ airportname   │ city     │ country       │ faa │ alt      │ lat     │ lon      │ icao │ id   │ type    │ tz              │
├───┼───────────────┼──────────┼───────────────┼─────┼──────────┼─────────┼──────────┼──────┼──────┼─────────┼─────────────────┤
│ 0 │ Columbia Rgnl │ Columbia │ United States │ COU │ 889.0000 │ 38.8181 │ -92.2196 │ KCOU │ 3719 │ airport │ America/Chicago │
╰───┴───────────────┴──────────┴───────────────┴─────┴──────────┴─────────┴──────────┴──────┴──────┴─────────┴─────────────────╯
We can then pipe this to `save`:
> doc get airport_3719 --bucket travel-sample | get content | flatten | save mydoc.csv
> cat mydoc.csv
airportname,city,country,faa,alt,lat,lon,icao,id,type,tz
Columbia Rgnl,Columbia,United States,COU,889,38.818094,-92.219631,KCOU,3719,airport,America/Chicago
From Query/Analytics
> query "SELECT `travel-sample`.* from `travel-sample` WHERE `type`='airport' LIMIT 5"
╭───┬────────────────────┬───────────────────┬─────────┬─────┬───────────────────┬──────┬──────┬─────────┬──────────────┬───────────╮
│ # │ airportname        │ city              │ country │ faa │ geo               │ icao │ id   │ type    │ tz           │ database  │
├───┼────────────────────┼───────────────────┼─────────┼─────┼───────────────────┼──────┼──────┼─────────┼──────────────┼───────────┤
│ 0 │ Calais Dunkerque   │ Calais            │ France  │ CQF │ {record 3 fields} │ LFAC │ 1254 │ airport │ Europe/Paris │ dev.local │
│ 1 │ Peronne St Quentin │ Peronne           │ France  │     │ {record 3 fields} │ LFAG │ 1255 │ airport │ Europe/Paris │ dev.local │
│ 2 │ Les Loges          │ Nangis            │ France  │     │ {record 3 fields} │ LFAI │ 1256 │ airport │ Europe/Paris │ dev.local │
│ 3 │ Couterne           │ Bagnole-de-l'orne │ France  │     │ {record 3 fields} │ LFAO │ 1257 │ airport │ Europe/Paris │ dev.local │
│ 4 │ Bray               │ Albert            │ France  │     │ {record 3 fields} │ LFAQ │ 1258 │ airport │ Europe/Paris │ dev.local │
╰───┴────────────────────┴───────────────────┴─────────┴─────┴───────────────────┴──────┴──────┴─────────┴──────────────┴───────────╯
Similar to the above, we need to flatten out the `geo` data before we can save it to CSV:
> query "SELECT `travel-sample`.* from `travel-sample` WHERE `type`='airport' LIMIT 5" | flatten | save mydoc.csv
6.3. Useful snippets
This section contains a collection of useful commands and sets of commands which don’t really fit into their own section of recipes.
6.3.1. Migrating scope and collection definitions
When you create a new cluster it can be useful to migrate scope and collection definitions from an old cluster. A good example here is migrating from an on-premise cluster to a Capella cluster.
To migrate scopes, except the `_default` scope:
scopes --databases "On-Prem-Cluster" --bucket travel-sample | select scope | where scope != "_default" | each { |it| scopes create $it.scope --databases "Capella-Cluster" }
To migrate all collections, except the `_default` collection:
collections --databases "On-Prem-Cluster" --bucket "travel-sample" | select scope collection | where $it.scope != "_default" | where $it.collection != "_default" | each { |it| collections create $it.collection --databases "Capella-Cluster" --bucket "travel-sample-import" --scope $it.scope }
These examples can easily be extended to filter out any other scopes and collections you do not want to migrate.
For example, to filter more scopes you would just add more `where` clauses: … | where scope != "_default" | where scope != "inventory" | …
6.3.2. Migrating query index definitions
When you create a new cluster it can be useful to migrate index definitions from an old cluster. A good example here is migrating from an on-premise cluster to a Capella cluster.
To migrate all of your index definitions:
query indexes --definitions --databases "On-Prem-Cluster" | get definition | each { |it| query $it --databases "Capella-Cluster" }
7. Release Notes
7.1. 0.75.1 - 2023-04-13
This release contains a number of breaking changes, which are explicitly called out below. As our versioning continues to track the underlying Nushell minor version, this has required breaking changes in a patch version.
- Breaking: Updated config file to rename `hostnames` to `connstr` and changed the format to be a string.
- Added support, and detection, for different "cluster types": Capella and Other. This allows us to modify behaviour based on cluster type.
- Breaking: Renamed `clusters health` to `health`.
- Breaking: Renamed other `clusters …` commands to `database …`.
- Replaced references to cluster with database.
- Breaking: Removed support for `whoami`.
- Added support for username aliases - added `display_name` to config.
- Trust the system store and Capella root CA when no certificate set.
- Updated tutorial.
- Added support for generating a config file when one does not exist.
- Added support for connecting to kv nodes in parallel.
- Added support for the `doc import` command.
- Added prompt indicator to help distinguish if a Capella or "other" cluster type is in use.
- Fast fail `buckets` commands when used with Capella.
- Updated where config files are automatically written to.
- Fixed issue with config.nu file on Windows.
7.2. 0.75.0 - 2023-02-09
- Nushell pinned to 0.75.
- Pulled all beta and alpha release versions and updated version numbering.
- Couchbase Shell versions will now map to the Nushell version being used.
- Bundle Capella root CA to allow seamlessly connecting over TLS.
- Automatically detect when `query_context` should be sent.
- Update when SRV lookups are performed.
- Statically link OpenSSL.
- Various logging and error enhancements.
- Remove support for Capella InVpc.
- Renamed `clusters managed` to `cb-env managed`.
- Renamed `clusters register/unregister` to `cb-env register/unregister`.
- Expose CIDR in result of `clusters`.
- Fetch collection id over memcached rather than http.