Version: 1.3.8

PandasGoogleCloudStorageDatasource

Signature

class great_expectations.datasource.fluent.PandasGoogleCloudStorageDatasource(*, type: Literal['pandas_gcs'] = 'pandas_gcs', name: str, id: Optional[uuid.UUID] = None, assets: List[great_expectations.datasource.fluent.data_asset.path.file_asset.FileDataAsset] = [], bucket_or_name: str, gcs_options: Dict[str, Union[great_expectations.datasource.fluent.config_str.ConfigStr, Any]] = {})

PandasGoogleCloudStorageDatasource is a PandasDatasource that uses Google Cloud Storage as a data store.
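
The snippet below is a minimal sketch of registering this datasource through a Data Context. The bucket name and service-account key path are placeholders, and the context.data_sources.add_pandas_gcs factory call is assumed to be the fluent entry point that constructs this class; adapt it to your own context and credentials.

import great_expectations as gx

# Obtain a Data Context (file-backed or ephemeral, depending on your setup).
context = gx.get_context()

# Register a Pandas datasource that reads from a GCS bucket.
# "my-gcs-bucket" and the key path are placeholders; gcs_options is passed
# through when the underlying google.cloud.storage client is created.
datasource = context.data_sources.add_pandas_gcs(
    name="my_pandas_gcs",
    bucket_or_name="my-gcs-bucket",
    gcs_options={"filename": "/path/to/service_account_key.json"},
)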

Methods

add_csv_asset

Signature

add_csv_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, sep: typing.Optional[str] = None, delimiter: typing.Optional[str] = None, header: Union[int, Sequence[int], None, Literal['infer']] = 'infer', names: Union[Sequence[str], None] = None, index_col: Union[IndexLabel, Literal[False], None] = None, usecols: typing.Optional[typing.Union[int, str, typing.Sequence[int]]] = None, dtype: typing.Optional[dict] = None, engine: Union[CSVEngine, None] = None, true_values: typing.Optional[typing.List] = None, false_values: typing.Optional[typing.List] = None, skipinitialspace: bool = False, skiprows: typing.Optional[typing.Union[typing.Sequence[int], int]] = None, skipfooter: int = 0, nrows: typing.Optional[int] = None, na_values: Union[Sequence[str], None] = None, keep_default_na: bool = True, na_filter: bool = True, verbose: bool = False, skip_blank_lines: bool = True, parse_dates: Union[bool, Sequence[str], None] = None, infer_datetime_format: bool = None, keep_date_col: bool = False, date_format: typing.Optional[str] = None, dayfirst: bool = False, cache_dates: bool = True, iterator: bool = False, chunksize: typing.Optional[int] = None, compression: CompressionOptions = 'infer', thousands: typing.Optional[str] = None, decimal: str = '.', lineterminator: typing.Optional[str] = None, quotechar: str = '"', quoting: int = 0, doublequote: bool = True, escapechar: typing.Optional[str] = None, comment: typing.Optional[str] = None, encoding: typing.Optional[str] = None, encoding_errors: typing.Optional[str] = 'strict', dialect: typing.Optional[str] = None, on_bad_lines: str = 'error', delim_whitespace: bool = False, low_memory: bool = True, memory_map: bool = False, float_precision: Union[Literal['high', 'legacy'], None] = None, storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a CSV asset to the datasource.
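
As a hedged illustration, the call below adds a CSV asset whose reader options mirror the pandas.read_csv arguments in the signature above; the asset name and option values are placeholders.

# Assumes `datasource` was created as shown earlier on this page.
csv_asset = datasource.add_csv_asset(
    name="taxi_trips_csv",
    sep=",",
    header="infer",
    encoding="utf-8",
)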

add_excel_asset

Signature

add_excel_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, sheet_name: typing.Optional[typing.Union[str, int, typing.List[typing.Union[int, str]]]] = 0, header: Union[int, Sequence[int], None] = 0, names: typing.Optional[typing.List[str]] = None, index_col: Union[int, Sequence[int], None] = None, usecols: typing.Optional[typing.Union[int, str, typing.Sequence[int]]] = None, dtype: typing.Optional[dict] = None, engine: Union[Literal['xlrd', 'openpyxl', 'odf', 'pyxlsb'], None] = None, true_values: Union[Iterable[str], None] = None, false_values: Union[Iterable[str], None] = None, skiprows: typing.Optional[typing.Union[typing.Sequence[int], int]] = None, nrows: typing.Optional[int] = None, na_values: typing.Any = None, keep_default_na: bool = True, na_filter: bool = True, verbose: bool = False, parse_dates: typing.Union[typing.List, typing.Dict, bool] = False, date_format: typing.Optional[str] = None, thousands: typing.Optional[str] = None, decimal: str = '.', comment: typing.Optional[str] = None, skipfooter: int = 0, storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, engine_kwargs: typing.Optional[typing.Dict] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add an Excel asset to the datasource.
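
A similar sketch for Excel data, again with placeholder values; sheet_name and header map directly to the pandas.read_excel options listed in the signature above.

excel_asset = datasource.add_excel_asset(
    name="quarterly_report_xlsx",
    sheet_name=0,  # first worksheet
    header=0,      # first row holds the column names
)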

add_feather_asset

Signature

add_feather_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, columns: Union[Sequence[str], None] = None, use_threads: bool = True, storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a Feather asset to the datasource.

add_fwf_asset

Signature

add_fwf_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, colspecs: Union[Sequence[Tuple[int, int]], str, None] = 'infer', widths: Union[Sequence[int], None] = None, infer_nrows: int = 100, dtype_backend: DtypeBackend = None, kwargs: typing.Optional[dict] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a fixed-width file (FWF) asset to the datasource.

add_hdf_asset

Signature

add_hdf_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, key: typing.Any = None, mode: str = 'r', errors: str = 'strict', where: typing.Optional[typing.Union[str, typing.List]] = None, start: typing.Optional[int] = None, stop: typing.Optional[int] = None, columns: typing.Optional[typing.List[str]] = None, iterator: bool = False, chunksize: typing.Optional[int] = None, kwargs: typing.Optional[dict] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add an HDF asset to the datasource.

add_html_asset

Signature

add_html_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, match: Union[str, Pattern] = '.+', flavor: typing.Optional[str] = None, header: Union[int, Sequence[int], None] = None, index_col: Union[int, Sequence[int], None] = None, skiprows: typing.Optional[typing.Union[typing.Sequence[int], int]] = None, attrs: typing.Optional[typing.Dict[str, str]] = None, parse_dates: bool = False, thousands: typing.Optional[str] = ',', encoding: typing.Optional[str] = None, decimal: str = '.', converters: typing.Optional[typing.Dict] = None, na_values: Union[Iterable[object], None] = None, keep_default_na: bool = True, displayed_only: bool = True, extract_links: Literal[None, 'header', 'footer', 'body', 'all'] = None, dtype_backend: DtypeBackend = None, storage_options: StorageOptions = None, **extra_data: typing.Any) → pydantic.BaseModel

Add an HTML asset to the datasource.

add_json_asset

Signature

add_json_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, orient: typing.Optional[str] = None, typ: Literal['frame', 'series'] = 'frame', dtype: typing.Optional[dict] = None, convert_axes: typing.Optional[bool] = None, convert_dates: typing.Union[bool, typing.List[str]] = True, keep_default_dates: bool = True, precise_float: bool = False, date_unit: typing.Optional[str] = None, encoding: typing.Optional[str] = None, encoding_errors: typing.Optional[str] = 'strict', lines: bool = False, chunksize: typing.Optional[int] = None, compression: CompressionOptions = 'infer', nrows: typing.Optional[int] = None, storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a JSON asset to the datasource.
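
For newline-delimited JSON, the sketch below sets lines=True together with orient="records", matching the pandas.read_json options in the signature above; the asset name is illustrative.

json_asset = datasource.add_json_asset(
    name="events_jsonl",
    orient="records",
    lines=True,  # one JSON object per line
)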

add_orc_asset

Signature

add_orc_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, columns: typing.Optional[typing.List[str]] = None, dtype_backend: DtypeBackend = None, kwargs: typing.Optional[dict] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add an ORC asset to the datasource.

add_parquet_asset

Signature

add_parquet_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, engine: str = 'auto', columns: typing.Optional[typing.List[str]] = None, storage_options: Union[StorageOptions, None] = None, use_nullable_dtypes: bool = None, dtype_backend: DtypeBackend = None, kwargs: typing.Optional[dict] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a Parquet asset to the datasource.
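
An illustrative Parquet asset that restricts reads to a subset of columns; the asset name and column list are placeholders.

parquet_asset = datasource.add_parquet_asset(
    name="sensor_readings_parquet",
    columns=["sensor_id", "timestamp", "value"],
)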

add_pickle_asset

Signature

add_pickle_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, compression: CompressionOptions = 'infer', storage_options: Union[StorageOptions, None] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a pickle asset to the datasource.

add_sas_asset

Signature

add_sas_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, format: typing.Optional[str] = None, index: typing.Optional[str] = None, encoding: typing.Optional[str] = None, chunksize: typing.Optional[int] = None, iterator: bool = False, compression: CompressionOptions = 'infer', **extra_data: typing.Any) → pydantic.BaseModel

Add a SAS asset to the datasource.

add_spss_asset

Signature

add_spss_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, usecols: typing.Optional[typing.Union[int, str, typing.Sequence[int]]] = None, convert_categoricals: bool = True, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) → pydantic.BaseModel

Add an SPSS asset to the datasource.

add_stata_asset

Signature

add_stata_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, convert_dates: bool = True, convert_categoricals: bool = True, index_col: typing.Optional[str] = None, convert_missing: bool = False, preserve_dtypes: bool = True, columns: Union[Sequence[str], None] = None, order_categoricals: bool = True, chunksize: typing.Optional[int] = None, iterator: bool = False, compression: CompressionOptions = 'infer', storage_options: Union[StorageOptions, None] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a Stata asset to the datasource.

add_xml_asset

Signature

add_xml_asset(name: str, *, id: pydantic.v1.fields.DeferredType = None, order_by: pydantic.v1.fields.DeferredType = None, batch_metadata: pydantic.v1.fields.DeferredType = None, batch_definitions: pydantic.v1.fields.DeferredType = None, connect_options: pydantic.v1.fields.DeferredType = None, xpath: str = './*', namespaces: typing.Optional[typing.Dict[str, str]] = None, elems_only: bool = False, attrs_only: bool = False, names: Union[Sequence[str], None] = None, dtype: typing.Optional[dict] = None, encoding: typing.Optional[str] = 'utf-8', stylesheet: Union[FilePath, None] = None, iterparse: typing.Optional[typing.Dict[str, typing.List[str]]] = None, compression: CompressionOptions = 'infer', storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) → pydantic.BaseModel

Add an XML asset to the datasource.

delete_asset

Signature

delete_asset(name: str) → None

Removes the DataAsset referred to by name from the internal list of available DataAsset objects.

Parameters

name: The name of the DataAsset to be deleted.
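
For example, the call below removes a previously added asset; the asset name is a placeholder carried over from the CSV sketch above.

datasource.delete_asset(name="taxi_trips_csv")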

get_asset

Signature

get_asset(name: str) → great_expectations.datasource.fluent.interfaces._DataAssetT

Returns the DataAsset referred to by name.

Parameters

name: The name of the DataAsset sought.

Returns

great_expectations.datasource.fluent.interfaces._DataAssetT: The named DataAsset, if it exists; otherwise, an exception is raised.