SQLite

Install
Install Ibis and dependencies for the SQLite backend:
Install with the sqlite extra:

pip install 'ibis-framework[sqlite]'

Or install from conda-forge with conda:

conda install -c conda-forge ibis-sqlite

Or with mamba:

mamba install -c conda-forge ibis-sqlite

Then connect (adjust connection parameters as needed):

import ibis

con = ibis.sqlite.connect()
Connect
ibis.sqlite.connect
Use an ephemeral, in-memory database:

con = ibis.sqlite.connect()

Connect to, or create, a local SQLite file:

con = ibis.sqlite.connect("mydb.sqlite")

ibis.sqlite.connect is a thin wrapper around ibis.backends.sqlite.Backend.do_connect.
Connection Parameters
do_connect
do_connect(self, database=None, type_map=None)
Create an Ibis client connected to a SQLite database.
Multiple database files can be accessed using the attach() method.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| database | str \| pathlib.Path \| None | File path to the SQLite database file. If None, creates an in-memory transient database; use attach() to add more files. | None |
| type_map | dict[str, str \| ibis.DataType] \| None | An optional mapping from a string name of a SQLite “type” to the corresponding ibis DataType that it represents. This can be used to override schema inference for a given SQLite database. | None |
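As a sketch of overriding schema inference with type_map (the file name and the specific mapping below are assumptions, not part of the official docs):

import ibis

# Hypothetical override: read columns declared as REAL in SQLite
# as 32-bit floats instead of the inferred default.
con = ibis.sqlite.connect("mydb.sqlite", type_map={"REAL": "float32"})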
Examples
>>> import ibis
>>> ibis.sqlite.connect("path/to/my/sqlite.db")

ibis.connect URL format

In addition to ibis.sqlite.connect, you can also connect to SQLite by passing a properly formatted SQLite connection URL to ibis.connect:

con = ibis.connect("sqlite:///path/to/local/file")

The URL can be sqlite://, which will connect to an ephemeral in-memory database:

con = ibis.connect("sqlite://")

sqlite.Backend
add_operation
add_operation(self, operation)
Add a translation function to the backend for a specific operation.
Operations are defined in ibis.expr.operations, and a translation function receives the translator object and an expression as parameters, and returns a value depending on the backend.
attach
attach(self, name, path)
Connect another SQLite database file to the current connection.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| name | str | Database name within SQLite | required |
| path | str \| pathlib.Path | Path to the sqlite3 database file | required |
Examples
>>> con1 = ibis.sqlite.connect("original.db")
>>> con2 = ibis.sqlite.connect("new.db")
>>> con1.attach("new", "new.db")
>>> con1.list_tables(database="new")

begin
begin(self)
compile
compile(self, expr, limit=None, params=None, timecontext=None)
Compile an Ibis expression.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| expr | ibis.Expr | Ibis expression | required |
| limit | str \| None | For expressions yielding result sets, retrieve at most this number of values/rows. Overrides any limit already set on the expression. | None |
| params | collections.abc.Mapping[ibis.Expr, typing.Any] \| None | Named unbound parameters | None |
| timecontext | tuple[pandas.Timestamp, pandas.Timestamp] \| None | Additional information about data source time boundaries | None |
Returns
| Type | Description |
|---|---|
| typing.Any | The output of compilation. The type of this value depends on the backend. |
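For example, compiling a small expression shows the query the backend will run; the table and column names here are hypothetical, and the exact type of the compiled output depends on the backend:

t = con.table("events")                        # hypothetical table
compiled = con.compile(t.filter(t.value > 0).limit(5))
print(compiled)                                # backend-specific compiled form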
connect
connect(self, *args, **kwargs)
Connect to the database.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| *args |  | Mandatory connection parameters; see the docstring of do_connect for details. | () |
| **kwargs |  | Extra connection parameters; see the docstring of do_connect for details. | {} |
Notes
This creates a new backend instance with saved args and kwargs, then calls reconnect and finally returns the newly created and connected backend instance.
Returns
| Type | Description |
|---|---|
| ibis.backends.base.BaseBackend | An instance of the backend |
create_table
create_table(self, name, obj=None, *, schema=None, database=None, temp=False, overwrite=False)
Create a table.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| name | str | Name of the new table. | required |
| obj | pandas.DataFrame \| pyarrow.Table \| ibis.Table \| None | An Ibis table expression or pandas table that will be used to extract the schema and the data of the new table. If not provided, schema must be given. | None |
| schema | ibis.Schema \| None | The schema for the new table. Only one of schema or obj can be provided. | None |
| database | str \| None | Name of the database where the table will be created, if not the default. | None |
| temp | bool | Whether the table should be temporary for the session. | False |
| overwrite | bool | Clobber existing data | False |
Returns
| Type | Description |
|---|---|
| Table | The table that was created. |
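A minimal sketch of creating a table from an in-memory pandas DataFrame (the table and column names are illustrative):

import pandas as pd

df = pd.DataFrame({"id": [1, 2, 3], "name": ["a", "b", "c"]})
# Create (or replace) a table populated with the DataFrame's data
t = con.create_table("my_table", df, overwrite=True)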
create_view
create_view(self, name, obj, *, database=None, overwrite=False)
database
database(self, name=None)
Return a Database object for the named database.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| name | str \| None | Name of the database to return the object for. | None |
Returns
| Type | Description |
|---|---|
| ibis.backends.base.Database | A database object for the specified database. |
drop_table
drop_table(self, name, *, database=None, force=False)
Drop a table.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| name | str | Table to drop | required |
| database | str \| None | Database to drop table from | None |
| force | bool | Check for existence before dropping | False |
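For example, to drop a table only if it exists (the table name is hypothetical):

# force=True checks for existence before dropping
con.drop_table("my_table", force=True)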
drop_view
drop_view(self, name, *, database=None, force=False)
execute
execute(self, expr, params=None, limit='default', **kwargs)
Compile and execute an Ibis expression.
Compile and execute Ibis expression using this backend client interface, returning results in-memory in the appropriate object type
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| expr | ibis.Expr | Ibis expression | required |
| limit | str | For expressions yielding result sets, retrieve at most this number of values/rows. Overrides any limit already set on the expression. | 'default' |
| params | collections.abc.Mapping[ibis.Scalar, typing.Any] \| None | Named unbound parameters | None |
| kwargs | typing.Any | Backend specific arguments. For example, the clickhouse backend uses this to receive external_tables as a dictionary of pandas DataFrames. | {} |
Returns
| Type | Description |
|---|---|
| DataFrame \| Series \| Scalar | Table: pandas.DataFrame; Column: pandas.Series; Scalar: Python scalar value |
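A short sketch of executing a table expression (the table name is hypothetical); a Table expression comes back as a pandas DataFrame:

t = con.table("events")          # hypothetical table
df = con.execute(t.limit(10))    # pandas.DataFrame with at most 10 rows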
fetch_from_cursor
fetch_from_cursor(self, cursor, schema)
has_operation
has_operation(cls, operation)
insert
insert(self, table_name, obj, database=None, overwrite=False)
Insert data into a table.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| table_name | str | The name of the table into which data will be inserted | required |
| obj | pandas.DataFrame \| ibis.Table \| list \| dict | The source data or expression to insert | required |
| database | str \| None | Name of the attached database that the table is located in. | None |
| overwrite | bool | If True then replace existing contents of table | False |
Raises
| Type | Description |
|---|---|
| NotImplementedError | If inserting data from a different database |
| ValueError | If the type of obj isn’t supported |
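A minimal sketch of appending rows from a pandas DataFrame into an existing table (the table and column names are assumptions):

import pandas as pd

new_rows = pd.DataFrame({"id": [4, 5], "name": ["d", "e"]})
# Append to an existing table with a matching schema
con.insert("my_table", new_rows)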
list_databases
list_databases(self, like=None)
List existing databases in the current connection.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| like | str \| None | A pattern in Python’s regex format to filter returned database names. | None |
Returns
| Type | Description |
|---|---|
| list[str] | The database names that exist in the current connection, that match the like pattern if provided. |
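For example (the filter pattern below is illustrative):

con.list_databases()             # e.g. ['main', ...]
con.list_databases(like="^m")    # only names matching the regex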
list_tables
list_tables(self, like=None, database=None)
raw_sql
raw_sql(self, query)
Execute a query and return the cursor used for execution.
Consider using .sql instead
If your query is a SELECT statement, you can use the backend .sql method to avoid having to manually release the cursor returned from this method.
You do not need to call .close() on the cursor when running DDL or DML statements like CREATE, INSERT or DROP, only when using SELECT statements.
To release a cursor, call its close method explicitly:

cursor = con.raw_sql("SELECT ...")
cursor.close()

Or use a context manager:

with con.raw_sql("SELECT ...") as cursor:
    ...

Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| query | str \| sqlalchemy.sql.ClauseElement | SQL query or SQLAlchemy expression to execute | required |
Examples
>>> con = ibis.connect("duckdb://")
>>> with con.raw_sql("SELECT 1") as cursor:
... result = cursor.fetchall()
>>> result
[(1,)]
>>> cursor.closed
True

read_csv
read_csv(self, path, table_name=None, **kwargs)
Register a CSV file as a table in the current backend.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| path | str \| pathlib.Path | The data source. A string or Path to the CSV file. | required |
| table_name | str \| None | An optional name to use for the created table. This defaults to a sequentially generated name. | None |
| **kwargs | typing.Any | Additional keyword arguments passed to the backend loading function. | {} |
Returns
| Type | Description |
|---|---|
| ibis.Table | The just-registered table |
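A short sketch (the file path and table name are illustrative):

# Register a local CSV file under an explicit table name
t = con.read_csv("data/events.csv", table_name="events")
t.head()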
read_delta
read_delta(self, source, table_name=None, **kwargs)
Register a Delta Lake table in the current database.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| source | str \| pathlib.Path | The data source. Must be a directory containing a Delta Lake table. | required |
| table_name | str \| None | An optional name to use for the created table. This defaults to a sequentially generated name. | None |
| **kwargs | typing.Any | Additional keyword arguments passed to the underlying backend or library. | {} |
Returns
| Type | Description |
|---|---|
| ibis.Table | The just-registered table. |
read_json
read_json(self, path, table_name=None, **kwargs)
Register a JSON file as a table in the current backend.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| path | str \| pathlib.Path | The data source. A string or Path to the JSON file. | required |
| table_name | str \| None | An optional name to use for the created table. This defaults to a sequentially generated name. | None |
| **kwargs | typing.Any | Additional keyword arguments passed to the backend loading function. | {} |
Returns
| Type | Description |
|---|---|
| ibis.Table | The just-registered table |
read_parquet
read_parquet(self, path, table_name=None, **kwargs)
Register a parquet file as a table in the current backend.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| path | str \| pathlib.Path | The data source. | required |
| table_name | str \| None | An optional name to use for the created table. This defaults to a sequentially generated name. | None |
| **kwargs | typing.Any | Additional keyword arguments passed to the backend loading function. | {} |
Returns
| Type | Description |
|---|---|
| ibis.Table | The just-registered table |
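A short sketch (the file path is illustrative; the table name defaults to a generated one):

# Register a local Parquet file; extra keyword arguments are
# forwarded to the backend loading function.
t = con.read_parquet("data/events.parquet")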
reconnect
reconnect(self)
Reconnect to the database already configured with connect.
register_options
register_options(cls)
Register custom backend options.
rename_table
rename_table(self, old_name, new_name)
Rename an existing table.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| old_name | str | The old name of the table. | required |
| new_name | str | The new name of the table. | required |
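For example (both names are hypothetical):

con.rename_table("my_table", "my_table_renamed")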
schema
schema(self, name)
Get an ibis schema from the current database for the table name.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| name | str | Table name | required |
Returns
| Type | Description |
|---|---|
| Schema | The ibis schema of name |
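For example (the table name is hypothetical):

sch = con.schema("events")    # ibis Schema with column names and dtypes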
sql
sql(self, query, schema=None, dialect=None)
Convert a SQL query to an Ibis table expression.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| query | str | SQL string | required |
| schema | ibis.Schema \| None | The expected schema for this query. If not provided, will be inferred automatically if possible. | None |
| dialect | str \| None | Optional string indicating the dialect of query. The default value of None will use the backend’s native dialect. | None |
Returns
| Type | Description |
|---|---|
| Table | Table expression |
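A sketch of wrapping a raw query and continuing with Ibis expressions on top of it (the query, table, and column names are assumptions):

import ibis

t = con.sql("SELECT name, COUNT(*) AS n FROM my_table GROUP BY name")
expr = t.order_by(ibis.desc("n")).limit(10)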
table
table(self, name, database=None, schema=None)
Create a table expression from a table in the database.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| name | str | Table name | required |
| database | str \| None | The database the table resides in | None |
| schema | str \| None | The schema inside database where the table resides. Note: here schema refers to the database hierarchy, not to the column names and types of table. | None |
Returns
| Type | Description |
|---|---|
| Table | Table expression |
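For example (the table name is hypothetical):

t = con.table("events")    # table expression; nothing is executed yet
expr = t.limit(5)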
to_csv
to_csv(self, expr, path, *, params=None, **kwargs)
Write the results of executing the given expression to a CSV file.
This method is eager and will execute the associated expression immediately.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| expr | ibis.Table | The ibis expression to execute and persist to CSV. | required |
| path | str \| pathlib.Path | The data source. A string or Path to the CSV file. | required |
| params | collections.abc.Mapping[ibis.Scalar, typing.Any] \| None | Mapping of scalar parameter expressions to value. | None |
| kwargs | typing.Any | Additional keyword arguments passed to pyarrow.csv.CSVWriter | {} |
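A short sketch of exporting an expression (the table name and output path are illustrative):

t = con.table("events")
con.to_csv(t.limit(100), "events_sample.csv")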
to_delta
to_delta(self, expr, path, *, params=None, **kwargs)
Write the results of executing the given expression to a Delta Lake table.
This method is eager and will execute the associated expression immediately.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| expr | ibis.Table | The ibis expression to execute and persist to Delta Lake table. | required |
| path | str \| pathlib.Path | The data source. A string or Path to the Delta Lake table. | required |
| params | collections.abc.Mapping[ibis.Scalar, typing.Any] \| None | Mapping of scalar parameter expressions to value. | None |
| kwargs | typing.Any | Additional keyword arguments passed to the deltalake.writer.write_deltalake method | {} |
to_pandas
to_pandas(self, expr, *, params=None, limit=None, **kwargs)
Execute an Ibis expression and return a pandas DataFrame, Series, or scalar.
This method is a wrapper around execute.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| expr | ibis.Expr | Ibis expression to execute. | required |
| params | collections.abc.Mapping[ibis.Scalar, typing.Any] \| None | Mapping of scalar parameter expressions to value. | None |
| limit | int \| str \| None | An integer to effect a specific row limit. A value of None means “no limit”. The default is in ibis/config.py. | None |
| kwargs | typing.Any | Keyword arguments | {} |
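For example (the table name is hypothetical):

t = con.table("events")
df = con.to_pandas(t, limit=1000)    # pandas.DataFrame with at most 1000 rows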
to_pandas_batches
to_pandas_batches(self, expr, *, params=None, limit=None, chunk_size=1000000, **kwargs)
Execute an Ibis expression and return an iterator of pandas DataFrames.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| expr | ibis.Expr | Ibis expression to execute. | required |
| params | collections.abc.Mapping[ibis.Scalar, typing.Any] \| None | Mapping of scalar parameter expressions to value. | None |
| limit | int \| str \| None | An integer to effect a specific row limit. A value of None means “no limit”. The default is in ibis/config.py. | None |
| chunk_size | int | Maximum number of rows in each returned DataFrame batch. This may have no effect depending on the backend. | 1000000 |
| kwargs | typing.Any | Keyword arguments | {} |
Returns
| Type | Description |
|---|---|
| collections.abc.Iterator[pandas.DataFrame] | An iterator of pandas DataFrames. |
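A sketch of streaming results in DataFrame chunks rather than materializing everything at once (the table name and chunk size are illustrative):

t = con.table("events")
for batch in con.to_pandas_batches(t, chunk_size=10_000):
    print(len(batch))    # each batch is a pandas.DataFrame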
to_parquet
to_parquet(self, expr, path, *, params=None, **kwargs)
Write the results of executing the given expression to a parquet file.
This method is eager and will execute the associated expression immediately.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| expr | ibis.Table | The ibis expression to execute and persist to parquet. | required |
| path | str \| pathlib.Path | The data source. A string or Path to the parquet file. | required |
| params | collections.abc.Mapping[ibis.Scalar, typing.Any] \| None | Mapping of scalar parameter expressions to value. | None |
| **kwargs | typing.Any | Additional keyword arguments passed to pyarrow.parquet.ParquetWriter | {} |
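A short sketch (the table name and output path are illustrative):

t = con.table("events")
con.to_parquet(t, "events.parquet")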
to_pyarrow
to_pyarrow(self, expr, *, params=None, limit=None, **kwargs)
Execute an expression and return the results as a pyarrow table.
This method is eager and will execute the associated expression immediately.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| expr | ibis.Expr | Ibis expression to export to pyarrow | required |
| params | collections.abc.Mapping[ibis.Scalar, typing.Any] \| None | Mapping of scalar parameter expressions to value. | None |
| limit | int \| str \| None | An integer to effect a specific row limit. A value of None means “no limit”. The default is in ibis/config.py. | None |
| kwargs | typing.Any | Keyword arguments | {} |
Returns
| Type | Description |
|---|---|
| Table | A pyarrow table holding the results of the executed expression. |
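For example (the table name is hypothetical):

t = con.table("events")
arrow_table = con.to_pyarrow(t.limit(1000))    # pyarrow.Table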
to_pyarrow_batches
to_pyarrow_batches(self, expr, *, params=None, limit=None, chunk_size=1000000, **_)
Execute expression and return an iterator of pyarrow record batches.
This method is eager and will execute the associated expression immediately.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| expr | ibis.Expr | Ibis expression to export to pyarrow | required |
| limit | int \| str \| None | An integer to effect a specific row limit. A value of None means “no limit”. The default is in ibis/config.py. | None |
| params | collections.abc.Mapping[ibis.Scalar, typing.Any] \| None | Mapping of scalar parameter expressions to value. | None |
| chunk_size | int | Maximum number of rows in each returned record batch. | 1000000 |
Returns
| Type | Description |
|---|---|
| RecordBatchReader | Collection of pyarrow RecordBatches. |
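A sketch of iterating over the returned reader (the table name and chunk size are illustrative):

t = con.table("events")
reader = con.to_pyarrow_batches(t, chunk_size=10_000)
for batch in reader:            # each item is a pyarrow.RecordBatch
    print(batch.num_rows)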
to_torch
to_torch(self, expr, *, params=None, limit=None, **kwargs)
Execute an expression and return results as a dictionary of torch tensors.
Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| expr | ibis.Expr | Ibis expression to execute. | required |
| params | collections.abc.Mapping[ibis.Scalar, typing.Any] \| None | Parameters to substitute into the expression. | None |
| limit | int \| str \| None | An integer to effect a specific row limit. A value of None means no limit. | None |
| kwargs | typing.Any | Keyword arguments passed into the backend’s to_torch implementation. | {} |
Returns
| Type | Description |
|---|---|
| dict[str, torch.Tensor] | A dictionary of torch tensors, keyed by column name. |
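A sketch, assuming a numeric column on a hypothetical table (requires torch to be installed):

t = con.table("events")
tensors = con.to_torch(t.select("value").limit(100))
value_tensor = tensors["value"]    # torch.Tensor for the "value" column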
truncate_table
truncate_table(self, name, database=None)