auto_combine(datasets[, concat_dim, compat, …]) Attempt to auto-magically combine the given datasets into one.
Dataset.nbytes
Dataset.chunks Block dimensions for this dataset’s data or None if it’s not a dask array.
Dataset.all([dim]) Reduce this Dataset’s data by applying all along some dimension(s).
Dataset.any([dim]) Reduce this Dataset’s data by applying any along some dimension(s).
Dataset.argmax([dim, skipna]) Reduce this Dataset’s data by applying argmax along some dimension(s).
Dataset.argmin([dim, skipna]) Reduce this Dataset’s data by applying argmin along some dimension(s).
Dataset.max([dim, skipna]) Reduce this Dataset’s data by applying max along some dimension(s).
Dataset.min([dim, skipna]) Reduce this Dataset’s data by applying min along some dimension(s).
Dataset.mean([dim, skipna]) Reduce this Dataset’s data by applying mean along some dimension(s).
Dataset.median([dim, skipna]) Reduce this Dataset’s data by applying median along some dimension(s).
Dataset.prod([dim, skipna]) Reduce this Dataset’s data by applying prod along some dimension(s).
Dataset.sum([dim, skipna]) Reduce this Dataset’s data by applying sum along some dimension(s).
Dataset.std([dim, skipna]) Reduce this Dataset’s data by applying std along some dimension(s).
Dataset.var([dim, skipna]) Reduce this Dataset’s data by applying var along some dimension(s).
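A minimal sketch of how these Dataset reductions behave on toy data (the dataset `ds` and variable name `t` are illustrative only):

```python
import numpy as np
import xarray as xr

# Toy dataset: one variable over dimensions ("x", "y"), including a NaN.
ds = xr.Dataset({"t": (("x", "y"), np.array([[1.0, 2.0], [3.0, np.nan]]))})

# Reductions apply to every data variable; skipna defaults to True for floats.
m_all = ds.mean()                          # mean over all dimensions
m_x = ds.mean(dim="x")                     # reduce only the "x" dimension
s = ds.sum(dim=("x", "y"), skipna=False)   # NaN propagates when skipna=False
```

Passing no `dim` reduces over every dimension; `skipna=False` makes missing values poison the result.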
core.coordinates.DatasetCoordinates.get(k[, d])
core.coordinates.DatasetCoordinates.items()
core.coordinates.DatasetCoordinates.keys()
core.coordinates.DatasetCoordinates.merge(other) Merge two sets of coordinates to create a new Dataset.
core.coordinates.DatasetCoordinates.to_dataset() Convert these coordinates into a new Dataset.
core.coordinates.DatasetCoordinates.to_index(…) Convert all index coordinates into a pandas.Index.
core.coordinates.DatasetCoordinates.update(…)
core.coordinates.DatasetCoordinates.values()
core.coordinates.DatasetCoordinates.dims
core.coordinates.DatasetCoordinates.indexes
core.coordinates.DatasetCoordinates.variables
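The coordinates object exposes a dict-like interface over coordinate variables; a small sketch with made-up coordinate names:

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"t": ("x", np.arange(3.0))},
    coords={"x": [10, 20, 30], "label": ("x", ["a", "b", "c"])},
)

# ds.coords maps coordinate names to DataArrays (keys/items/values/get).
names = sorted(ds.coords.keys())     # ["label", "x"]
coords_ds = ds.coords.to_dataset()   # coordinates promoted to their own Dataset
idx = ds.coords.to_index()           # index coordinates as a pandas.Index
```

With a single dimension, `to_index()` returns that dimension's index directly; with several it builds a `pandas.MultiIndex`.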
core.rolling.DatasetCoarsen.all(**kwargs) Reduce this DatasetCoarsen’s data by applying all along some dimension(s).
core.rolling.DatasetCoarsen.any(**kwargs) Reduce this DatasetCoarsen’s data by applying any along some dimension(s).
core.rolling.DatasetCoarsen.argmax(**kwargs) Reduce this DatasetCoarsen’s data by applying argmax along some dimension(s).
core.rolling.DatasetCoarsen.argmin(**kwargs) Reduce this DatasetCoarsen’s data by applying argmin along some dimension(s).
core.rolling.DatasetCoarsen.count(**kwargs) Reduce this DatasetCoarsen’s data by applying count along some dimension(s).
core.rolling.DatasetCoarsen.max(**kwargs) Reduce this DatasetCoarsen’s data by applying max along some dimension(s).
core.rolling.DatasetCoarsen.mean(**kwargs) Reduce this DatasetCoarsen’s data by applying mean along some dimension(s).
core.rolling.DatasetCoarsen.median(**kwargs) Reduce this DatasetCoarsen’s data by applying median along some dimension(s).
core.rolling.DatasetCoarsen.min(**kwargs) Reduce this DatasetCoarsen’s data by applying min along some dimension(s).
core.rolling.DatasetCoarsen.prod(**kwargs) Reduce this DatasetCoarsen’s data by applying prod along some dimension(s).
core.rolling.DatasetCoarsen.std(**kwargs) Reduce this DatasetCoarsen’s data by applying std along some dimension(s).
core.rolling.DatasetCoarsen.sum(**kwargs) Reduce this DatasetCoarsen’s data by applying sum along some dimension(s).
core.rolling.DatasetCoarsen.var(**kwargs) Reduce this DatasetCoarsen’s data by applying var along some dimension(s).
core.rolling.DatasetCoarsen.boundary
core.rolling.DatasetCoarsen.coord_func
core.rolling.DatasetCoarsen.obj
core.rolling.DatasetCoarsen.side
core.rolling.DatasetCoarsen.trim_excess
core.rolling.DatasetCoarsen.windows
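A `DatasetCoarsen` object is produced by `Dataset.coarsen` and reduced with one of the methods above; a toy downsampling sketch:

```python
import numpy as np
import xarray as xr

ds = xr.Dataset({"t": ("time", np.arange(6.0))})

# Mean over non-overlapping windows of 3 along "time": 6 values become 2.
coarse = ds.coarsen(time=3).mean()
```

The `boundary` and `side` attributes listed above control what happens when the dimension length is not an exact multiple of the window size.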
core.groupby.DatasetGroupBy.assign(**kwargs) Assign data variables by group.
core.groupby.DatasetGroupBy.assign_coords([…]) Assign coordinates by group.
core.groupby.DatasetGroupBy.first([skipna, …]) Return the first element of each group along the group dimension.
core.groupby.DatasetGroupBy.last([skipna, …]) Return the last element of each group along the group dimension.
core.groupby.DatasetGroupBy.fillna(value) Fill missing values in this object by group.
core.groupby.DatasetGroupBy.quantile(q[, …]) Compute the qth quantile over each array in the groups and concatenate them together into a new array.
core.groupby.DatasetGroupBy.where(cond[, other]) Return elements from self or other depending on cond.
core.groupby.DatasetGroupBy.all([dim]) Reduce this DatasetGroupBy’s data by applying all along some dimension(s).
core.groupby.DatasetGroupBy.any([dim]) Reduce this DatasetGroupBy’s data by applying any along some dimension(s).
core.groupby.DatasetGroupBy.argmax([dim, skipna]) Reduce this DatasetGroupBy’s data by applying argmax along some dimension(s).
core.groupby.DatasetGroupBy.argmin([dim, skipna]) Reduce this DatasetGroupBy’s data by applying argmin along some dimension(s).
core.groupby.DatasetGroupBy.count([dim]) Reduce this DatasetGroupBy’s data by applying count along some dimension(s).
core.groupby.DatasetGroupBy.max([dim, skipna]) Reduce this DatasetGroupBy’s data by applying max along some dimension(s).
core.groupby.DatasetGroupBy.mean([dim, skipna]) Reduce this DatasetGroupBy’s data by applying mean along some dimension(s).
core.groupby.DatasetGroupBy.median([dim, skipna]) Reduce this DatasetGroupBy’s data by applying median along some dimension(s).
core.groupby.DatasetGroupBy.min([dim, skipna]) Reduce this DatasetGroupBy’s data by applying min along some dimension(s).
core.groupby.DatasetGroupBy.prod([dim, skipna]) Reduce this DatasetGroupBy’s data by applying prod along some dimension(s).
core.groupby.DatasetGroupBy.std([dim, skipna]) Reduce this DatasetGroupBy’s data by applying std along some dimension(s).
core.groupby.DatasetGroupBy.sum([dim, skipna]) Reduce this DatasetGroupBy’s data by applying sum along some dimension(s).
core.groupby.DatasetGroupBy.var([dim, skipna]) Reduce this DatasetGroupBy’s data by applying var along some dimension(s).
core.groupby.DatasetGroupBy.dims
core.groupby.DatasetGroupBy.groups
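A sketch of the groupby workflow on toy data (the coordinate name `letter` is illustrative):

```python
import xarray as xr

ds = xr.Dataset(
    {"t": ("x", [1.0, 2.0, 3.0, 4.0])},
    coords={"letter": ("x", ["a", "a", "b", "b"])},
)

g = ds.groupby("letter")
means = g.mean()      # one value per group: a -> 1.5, b -> 3.5
first = g.first()     # first element per group: a -> 1.0, b -> 3.0
```

The reductions above collapse the grouped dimension into one entry per unique group label.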
core.resample.DatasetResample.all([dim]) Reduce this DatasetResample’s data by applying all along some dimension(s).
core.resample.DatasetResample.any([dim]) Reduce this DatasetResample’s data by applying any along some dimension(s).
core.resample.DatasetResample.apply(func[, …]) Backward-compatible implementation of map.
core.resample.DatasetResample.argmax([dim, …]) Reduce this DatasetResample’s data by applying argmax along some dimension(s).
core.resample.DatasetResample.argmin([dim, …]) Reduce this DatasetResample’s data by applying argmin along some dimension(s).
core.resample.DatasetResample.assign(**kwargs) Assign data variables by group.
core.resample.DatasetResample.assign_coords([…]) Assign coordinates by group.
core.resample.DatasetResample.bfill([tolerance]) Backward fill new values at up-sampled frequency.
core.resample.DatasetResample.count([dim]) Reduce this DatasetResample’s data by applying count along some dimension(s).
core.resample.DatasetResample.ffill([tolerance]) Forward fill new values at up-sampled frequency.
core.resample.DatasetResample.fillna(value) Fill missing values in this object by group.
core.resample.DatasetResample.first([…]) Return the first element of each group along the group dimension.
core.resample.DatasetResample.last([skipna, …]) Return the last element of each group along the group dimension.
core.resample.DatasetResample.map(func[, …]) Apply a function over each Dataset in the groups generated for resampling and concatenate them together into a new Dataset.
core.resample.DatasetResample.max([dim, skipna]) Reduce this DatasetResample’s data by applying max along some dimension(s).
core.resample.DatasetResample.mean([dim, skipna]) Reduce this DatasetResample’s data by applying mean along some dimension(s).
core.resample.DatasetResample.median([dim, …]) Reduce this DatasetResample’s data by applying median along some dimension(s).
core.resample.DatasetResample.min([dim, skipna]) Reduce this DatasetResample’s data by applying min along some dimension(s).
core.resample.DatasetResample.prod([dim, skipna]) Reduce this DatasetResample’s data by applying prod along some dimension(s).
core.resample.DatasetResample.quantile(q[, …]) Compute the qth quantile over each array in the groups and concatenate them together into a new array.
core.resample.DatasetResample.reduce(func[, …]) Reduce the items in this group by applying func along the pre-defined resampling dimension.
core.resample.DatasetResample.std([dim, skipna]) Reduce this DatasetResample’s data by applying std along some dimension(s).
core.resample.DatasetResample.sum([dim, skipna]) Reduce this DatasetResample’s data by applying sum along some dimension(s).
core.resample.DatasetResample.var([dim, skipna]) Reduce this DatasetResample’s data by applying var along some dimension(s).
core.resample.DatasetResample.where(cond[, …]) Return elements from self or other depending on cond.
core.resample.DatasetResample.dims
core.resample.DatasetResample.groups
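Resampling is groupby specialized to a datetime dimension; a minimal downsampling sketch (dates and variable names are illustrative):

```python
import numpy as np
import pandas as pd
import xarray as xr

times = pd.date_range("2000-01-01", periods=6, freq="D")
ds = xr.Dataset({"t": ("time", np.arange(6.0))}, coords={"time": times})

# Downsample daily values into 3-day bins, averaging within each bin.
low = ds.resample(time="3D").mean()
```

When upsampling to a finer frequency, `ffill`/`bfill` from the listing above fill the newly created slots instead of reducing.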
core.rolling.DatasetRolling.argmax(**kwargs) Reduce this object’s data windows by applying argmax along its dimension.
core.rolling.DatasetRolling.argmin(**kwargs) Reduce this object’s data windows by applying argmin along its dimension.
core.rolling.DatasetRolling.count() Reduce this object’s data windows by applying count along its dimension.
core.rolling.DatasetRolling.max(**kwargs) Reduce this object’s data windows by applying max along its dimension.
core.rolling.DatasetRolling.mean(**kwargs) Reduce this object’s data windows by applying mean along its dimension.
core.rolling.DatasetRolling.median(**kwargs) Reduce this object’s data windows by applying median along its dimension.
core.rolling.DatasetRolling.min(**kwargs) Reduce this object’s data windows by applying min along its dimension.
core.rolling.DatasetRolling.prod(**kwargs) Reduce this object’s data windows by applying prod along its dimension.
core.rolling.DatasetRolling.std(**kwargs) Reduce this object’s data windows by applying std along its dimension.
core.rolling.DatasetRolling.sum(**kwargs) Reduce this object’s data windows by applying sum along its dimension.
core.rolling.DatasetRolling.var(**kwargs) Reduce this object’s data windows by applying var along its dimension.
core.rolling.DatasetRolling.center
core.rolling.DatasetRolling.dim
core.rolling.DatasetRolling.min_periods
core.rolling.DatasetRolling.obj
core.rolling.DatasetRolling.rollings
core.rolling.DatasetRolling.window
core.rolling_exp.RollingExp.mean() Exponentially weighted moving average.
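The rolling entries above can be sketched as follows (toy data; `RollingExp.mean` additionally needs the optional numbagg dependency, so it is not exercised here):

```python
import numpy as np
import xarray as xr

ds = xr.Dataset({"t": ("x", np.arange(5.0))})

# A window of 3 along "x": the first two windows are incomplete and
# yield NaN unless min_periods relaxes the requirement.
full = ds.rolling(x=3).mean()                    # NaN, NaN, 1.0, 2.0, 3.0
partial = ds.rolling(x=3, min_periods=1).sum()   # 0.0, 1.0, 3.0, 6.0, 9.0
```

The `center`, `min_periods`, and `window` attributes listed above are exactly the knobs passed to `Dataset.rolling`.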
Dataset.argsort([axis, kind, order]) Returns the indices that would sort this array.
Dataset.astype(dtype[, order, casting, …]) Copy of the array, cast to a specified type.
Dataset.clip([min, max, out]) Return an array whose values are limited to [min, max].
Dataset.conj() Complex-conjugate all elements.
Dataset.conjugate() Return the complex conjugate, element-wise.
Dataset.imag
Dataset.round(*args, **kwargs)
Dataset.real
Dataset.cumsum([dim, skipna]) Apply cumsum along some dimension of Dataset.
Dataset.cumprod([dim, skipna]) Apply cumprod along some dimension of Dataset.
Dataset.rank(dim[, pct, keep_attrs]) Ranks the data.
Dataset.load_store(store[, decoder]) Create a new dataset from the contents of a backends.*DataStore object.
Dataset.dump_to_store(store, **kwargs) Store dataset contents to a backends.*DataStore object.
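A short sketch of the ndarray-like methods in this block (toy data; the store methods are omitted since they need a backend object):

```python
import numpy as np
import xarray as xr

ds = xr.Dataset({"t": ("x", np.array([3.0, 1.0, 2.0]))})

clipped = ds.clip(min=1.5, max=2.5)   # values limited to [1.5, 2.5]
running = ds.cumsum(dim="x")          # running total along "x"
as_int = ds.astype(int)               # cast every variable's dtype
```

These mirror the corresponding NumPy operations but are applied to every data variable and keep dimension labels.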
DataArray.ndim
DataArray.nbytes
DataArray.shape
DataArray.size
DataArray.dtype
DataArray.chunks Block dimensions for this array’s data or None if it’s not a dask array.
DataArray.astype(dtype[, order, casting, …]) Copy of the array, cast to a specified type.
DataArray.item(*args) Copy an element of an array to a standard Python scalar and return it.
DataArray.all([dim, axis]) Reduce this DataArray’s data by applying all along some dimension(s).
DataArray.any([dim, axis]) Reduce this DataArray’s data by applying any along some dimension(s).
DataArray.argmax([dim, axis, skipna]) Reduce this DataArray’s data by applying argmax along some dimension(s).
DataArray.argmin([dim, axis, skipna]) Reduce this DataArray’s data by applying argmin along some dimension(s).
DataArray.max([dim, axis, skipna]) Reduce this DataArray’s data by applying max along some dimension(s).
DataArray.min([dim, axis, skipna]) Reduce this DataArray’s data by applying min along some dimension(s).
DataArray.mean([dim, axis, skipna]) Reduce this DataArray’s data by applying mean along some dimension(s).
DataArray.median([dim, axis, skipna]) Reduce this DataArray’s data by applying median along some dimension(s).
DataArray.prod([dim, axis, skipna]) Reduce this DataArray’s data by applying prod along some dimension(s).
DataArray.sum([dim, axis, skipna]) Reduce this DataArray’s data by applying sum along some dimension(s).
DataArray.std([dim, axis, skipna]) Reduce this DataArray’s data by applying std along some dimension(s).
DataArray.var([dim, axis, skipna]) Reduce this DataArray’s data by applying var along some dimension(s).
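Unlike the Dataset variants, these DataArray reductions also accept a positional `axis`; a small sketch of both spellings:

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.array([[1.0, 2.0], [3.0, 4.0]]), dims=("x", "y"))

by_dim = da.mean(dim="y")    # reduce by dimension name -> [1.5, 3.5]
by_axis = da.mean(axis=0)    # positional axis, NumPy-style -> [2.0, 3.0]
```

Reducing by name is the idiomatic xarray form; `axis` exists mainly for NumPy compatibility.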
core.coordinates.DataArrayCoordinates.get(k[, d])
core.coordinates.DataArrayCoordinates.items()
core.coordinates.DataArrayCoordinates.keys()
core.coordinates.DataArrayCoordinates.merge(other) Merge two sets of coordinates to create a new Dataset.
core.coordinates.DataArrayCoordinates.to_dataset()
core.coordinates.DataArrayCoordinates.to_index(…) Convert all index coordinates into a pandas.Index.
core.coordinates.DataArrayCoordinates.update(…)
core.coordinates.DataArrayCoordinates.values()
core.coordinates.DataArrayCoordinates.dims
core.coordinates.DataArrayCoordinates.indexes
core.coordinates.DataArrayCoordinates.variables
core.rolling.DataArrayCoarsen.all(**kwargs) Reduce this DataArrayCoarsen’s data by applying all along some dimension(s).
core.rolling.DataArrayCoarsen.any(**kwargs) Reduce this DataArrayCoarsen’s data by applying any along some dimension(s).
core.rolling.DataArrayCoarsen.argmax(**kwargs) Reduce this DataArrayCoarsen’s data by applying argmax along some dimension(s).
core.rolling.DataArrayCoarsen.argmin(**kwargs) Reduce this DataArrayCoarsen’s data by applying argmin along some dimension(s).
core.rolling.DataArrayCoarsen.count(**kwargs) Reduce this DataArrayCoarsen’s data by applying count along some dimension(s).
core.rolling.DataArrayCoarsen.max(**kwargs) Reduce this DataArrayCoarsen’s data by applying max along some dimension(s).
core.rolling.DataArrayCoarsen.mean(**kwargs) Reduce this DataArrayCoarsen’s data by applying mean along some dimension(s).
core.rolling.DataArrayCoarsen.median(**kwargs) Reduce this DataArrayCoarsen’s data by applying median along some dimension(s).
core.rolling.DataArrayCoarsen.min(**kwargs) Reduce this DataArrayCoarsen’s data by applying min along some dimension(s).
core.rolling.DataArrayCoarsen.prod(**kwargs) Reduce this DataArrayCoarsen’s data by applying prod along some dimension(s).
core.rolling.DataArrayCoarsen.std(**kwargs) Reduce this DataArrayCoarsen’s data by applying std along some dimension(s).
core.rolling.DataArrayCoarsen.sum(**kwargs) Reduce this DataArrayCoarsen’s data by applying sum along some dimension(s).
core.rolling.DataArrayCoarsen.var(**kwargs) Reduce this DataArrayCoarsen’s data by applying var along some dimension(s).
core.rolling.DataArrayCoarsen.boundary
core.rolling.DataArrayCoarsen.coord_func
core.rolling.DataArrayCoarsen.obj
core.rolling.DataArrayCoarsen.side
core.rolling.DataArrayCoarsen.trim_excess
core.rolling.DataArrayCoarsen.windows
core.groupby.DataArrayGroupBy.assign_coords([…]) Assign coordinates by group.
core.groupby.DataArrayGroupBy.first([…]) Return the first element of each group along the group dimension.
core.groupby.DataArrayGroupBy.last([skipna, …]) Return the last element of each group along the group dimension.
core.groupby.DataArrayGroupBy.fillna(value) Fill missing values in this object by group.
core.groupby.DataArrayGroupBy.quantile(q[, …]) Compute the qth quantile over each array in the groups and concatenate them together into a new array.
core.groupby.DataArrayGroupBy.where(cond[, …]) Return elements from self or other depending on cond.
core.groupby.DataArrayGroupBy.all([dim, axis]) Reduce this DataArrayGroupBy’s data by applying all along some dimension(s).
core.groupby.DataArrayGroupBy.any([dim, axis]) Reduce this DataArrayGroupBy’s data by applying any along some dimension(s).
core.groupby.DataArrayGroupBy.argmax([dim, …]) Reduce this DataArrayGroupBy’s data by applying argmax along some dimension(s).
core.groupby.DataArrayGroupBy.argmin([dim, …]) Reduce this DataArrayGroupBy’s data by applying argmin along some dimension(s).
core.groupby.DataArrayGroupBy.count([dim, axis]) Reduce this DataArrayGroupBy’s data by applying count along some dimension(s).
core.groupby.DataArrayGroupBy.max([dim, …]) Reduce this DataArrayGroupBy’s data by applying max along some dimension(s).
core.groupby.DataArrayGroupBy.mean([dim, …]) Reduce this DataArrayGroupBy’s data by applying mean along some dimension(s).
core.groupby.DataArrayGroupBy.median([dim, …]) Reduce this DataArrayGroupBy’s data by applying median along some dimension(s).
core.groupby.DataArrayGroupBy.min([dim, …]) Reduce this DataArrayGroupBy’s data by applying min along some dimension(s).
core.groupby.DataArrayGroupBy.prod([dim, …]) Reduce this DataArrayGroupBy’s data by applying prod along some dimension(s).
core.groupby.DataArrayGroupBy.std([dim, …]) Reduce this DataArrayGroupBy’s data by applying std along some dimension(s).
core.groupby.DataArrayGroupBy.sum([dim, …]) Reduce this DataArrayGroupBy’s data by applying sum along some dimension(s).
core.groupby.DataArrayGroupBy.var([dim, …]) Reduce this DataArrayGroupBy’s data by applying var along some dimension(s).
core.groupby.DataArrayGroupBy.dims
core.groupby.DataArrayGroupBy.groups
core.resample.DataArrayResample.all([dim, axis]) Reduce this DataArrayResample’s data by applying all along some dimension(s).
core.resample.DataArrayResample.any([dim, axis]) Reduce this DataArrayResample’s data by applying any along some dimension(s).
core.resample.DataArrayResample.apply(func) Backward-compatible implementation of map.
core.resample.DataArrayResample.argmax([…]) Reduce this DataArrayResample’s data by applying argmax along some dimension(s).
core.resample.DataArrayResample.argmin([…]) Reduce this DataArrayResample’s data by applying argmin along some dimension(s).
core.resample.DataArrayResample.assign_coords([…]) Assign coordinates by group.
core.resample.DataArrayResample.bfill([…]) Backward fill new values at up-sampled frequency.
core.resample.DataArrayResample.count([dim, …]) Reduce this DataArrayResample’s data by applying count along some dimension(s).
core.resample.DataArrayResample.ffill([…]) Forward fill new values at up-sampled frequency.
core.resample.DataArrayResample.fillna(value) Fill missing values in this object by group.
core.resample.DataArrayResample.first([…]) Return the first element of each group along the group dimension.
core.resample.DataArrayResample.last([…]) Return the last element of each group along the group dimension.
core.resample.DataArrayResample.map(func[, …]) Apply a function to each array in the group and concatenate them together into a new array.
core.resample.DataArrayResample.max([dim, …]) Reduce this DataArrayResample’s data by applying max along some dimension(s).
core.resample.DataArrayResample.mean([dim, …]) Reduce this DataArrayResample’s data by applying mean along some dimension(s).
core.resample.DataArrayResample.median([…]) Reduce this DataArrayResample’s data by applying median along some dimension(s).
core.resample.DataArrayResample.min([dim, …]) Reduce this DataArrayResample’s data by applying min along some dimension(s).
core.resample.DataArrayResample.prod([dim, …]) Reduce this DataArrayResample’s data by applying prod along some dimension(s).
core.resample.DataArrayResample.quantile(q) Compute the qth quantile over each array in the groups and concatenate them together into a new array.
core.resample.DataArrayResample.reduce(func) Reduce the items in this group by applying func along some dimension(s).
core.resample.DataArrayResample.std([dim, …]) Reduce this DataArrayResample’s data by applying std along some dimension(s).
core.resample.DataArrayResample.sum([dim, …]) Reduce this DataArrayResample’s data by applying sum along some dimension(s).
core.resample.DataArrayResample.var([dim, …]) Reduce this DataArrayResample’s data by applying var along some dimension(s).
core.resample.DataArrayResample.where(cond) Return elements from self or other depending on cond.
core.resample.DataArrayResample.dims
core.resample.DataArrayResample.groups
core.rolling.DataArrayRolling.argmax(**kwargs) Reduce this object’s data windows by applying argmax along its dimension.
core.rolling.DataArrayRolling.argmin(**kwargs) Reduce this object’s data windows by applying argmin along its dimension.
core.rolling.DataArrayRolling.count() Reduce this object’s data windows by applying count along its dimension.
core.rolling.DataArrayRolling.max(**kwargs) Reduce this object’s data windows by applying max along its dimension.
core.rolling.DataArrayRolling.mean(**kwargs) Reduce this object’s data windows by applying mean along its dimension.
core.rolling.DataArrayRolling.median(**kwargs) Reduce this object’s data windows by applying median along its dimension.
core.rolling.DataArrayRolling.min(**kwargs) Reduce this object’s data windows by applying min along its dimension.
core.rolling.DataArrayRolling.prod(**kwargs) Reduce this object’s data windows by applying prod along its dimension.
core.rolling.DataArrayRolling.std(**kwargs) Reduce this object’s data windows by applying std along its dimension.
core.rolling.DataArrayRolling.sum(**kwargs) Reduce this object’s data windows by applying sum along its dimension.
core.rolling.DataArrayRolling.var(**kwargs) Reduce this object’s data windows by applying var along its dimension.
core.rolling.DataArrayRolling.center
core.rolling.DataArrayRolling.dim
core.rolling.DataArrayRolling.min_periods
core.rolling.DataArrayRolling.obj
core.rolling.DataArrayRolling.window
core.rolling.DataArrayRolling.window_labels
DataArray.argsort([axis, kind, order]) Returns the indices that would sort this array.
DataArray.clip([min, max, out]) Return an array whose values are limited to [min, max].
DataArray.conj() Complex-conjugate all elements.
DataArray.conjugate() Return the complex conjugate, element-wise.
DataArray.imag
DataArray.searchsorted(v[, side, sorter]) Find indices where elements of v should be inserted into this array to maintain order.
DataArray.round(*args, **kwargs)
DataArray.real
DataArray.T
DataArray.cumsum([dim, axis, skipna]) Apply cumsum along some dimension of DataArray.
DataArray.cumprod([dim, axis, skipna]) Apply cumprod along some dimension of DataArray.
DataArray.rank(dim[, pct, keep_attrs]) Ranks the data.
core.accessor_dt.DatetimeAccessor.ceil(freq) Round timestamps upward to specified frequency resolution.
core.accessor_dt.DatetimeAccessor.floor(freq) Round timestamps downward to specified frequency resolution.
core.accessor_dt.DatetimeAccessor.round(freq) Round timestamps to specified frequency resolution.
core.accessor_dt.DatetimeAccessor.strftime(…) Return an array of formatted strings specified by date_format, which supports the same string format as the Python standard library.
core.accessor_dt.DatetimeAccessor.day The days of the datetime
core.accessor_dt.DatetimeAccessor.dayofweek The day of the week with Monday=0, Sunday=6
core.accessor_dt.DatetimeAccessor.dayofyear The ordinal day of the year
core.accessor_dt.DatetimeAccessor.days_in_month The number of days in the month
core.accessor_dt.DatetimeAccessor.daysinmonth The number of days in the month
core.accessor_dt.DatetimeAccessor.hour The hours of the datetime
core.accessor_dt.DatetimeAccessor.microsecond The microseconds of the datetime
core.accessor_dt.DatetimeAccessor.minute The minutes of the datetime
core.accessor_dt.DatetimeAccessor.month The month as January=1, December=12
core.accessor_dt.DatetimeAccessor.nanosecond The nanoseconds of the datetime
core.accessor_dt.DatetimeAccessor.quarter The quarter of the date
core.accessor_dt.DatetimeAccessor.season Season of the year
core.accessor_dt.DatetimeAccessor.second The seconds of the datetime
core.accessor_dt.DatetimeAccessor.time Timestamps corresponding to datetimes
core.accessor_dt.DatetimeAccessor.week The week ordinal of the year
core.accessor_dt.DatetimeAccessor.weekday The day of the week with Monday=0, Sunday=6
core.accessor_dt.DatetimeAccessor.weekday_name The name of the day of the week
core.accessor_dt.DatetimeAccessor.weekofyear The week ordinal of the year
core.accessor_dt.DatetimeAccessor.year The year of the datetime
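A sketch of the `.dt` accessor on a toy datetime coordinate (dates chosen to straddle a year boundary):

```python
import pandas as pd
import xarray as xr

da = xr.DataArray(pd.date_range("2000-12-30", periods=3, freq="D"), dims="time")

years = da.dt.year          # 2000, 2000, 2001
days = da.dt.day            # 30, 31, 1
seasons = da.dt.season      # "DJF" for December and January
floored = da.dt.floor("D")  # timestamps truncated to day resolution
```

Each property returns a new DataArray of the same shape, so these compose naturally with groupby (e.g. grouping by `da.dt.season`).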
core.accessor_str.StringAccessor.capitalize() Convert strings in the array to be capitalized.
core.accessor_str.StringAccessor.center(width) Pad the left and right sides of strings in the array with an additional character.
core.accessor_str.StringAccessor.contains(pat) Test if pattern or regex is contained within a string of the array.
core.accessor_str.StringAccessor.count(pat) Count occurrences of pattern in each string of the array.
core.accessor_str.StringAccessor.decode(encoding) Decode character string in the array using indicated encoding.
core.accessor_str.StringAccessor.encode(encoding) Encode character string in the array using indicated encoding.
core.accessor_str.StringAccessor.endswith(pat) Test if the end of each string element matches a pattern.
core.accessor_str.StringAccessor.find(sub[, …]) Return the lowest or highest index in each string of the array where the substring is fully contained between [start:end].
core.accessor_str.StringAccessor.get(i) Extract element from indexable in each element in the array.
core.accessor_str.StringAccessor.index(sub) Return the lowest or highest index in each string where the substring is fully contained between [start:end].
core.accessor_str.StringAccessor.isalnum() Check whether all characters in each string are alphanumeric.
core.accessor_str.StringAccessor.isalpha() Check whether all characters in each string are alphabetic.
core.accessor_str.StringAccessor.isdecimal() Check whether all characters in each string are decimal.
core.accessor_str.StringAccessor.isdigit() Check whether all characters in each string are digits.
core.accessor_str.StringAccessor.islower() Check whether all characters in each string are lowercase.
core.accessor_str.StringAccessor.isnumeric() Check whether all characters in each string are numeric.
core.accessor_str.StringAccessor.isspace() Check whether all characters in each string are spaces.
core.accessor_str.StringAccessor.istitle() Check whether all characters in each string are titlecase.
core.accessor_str.StringAccessor.isupper() Check whether all characters in each string are uppercase.
core.accessor_str.StringAccessor.len() Compute the length of each element in the array.
core.accessor_str.StringAccessor.ljust(width) Pad the right side of strings in the array with an additional character.
core.accessor_str.StringAccessor.lower() Convert strings in the array to lowercase.
core.accessor_str.StringAccessor.lstrip([…]) Remove leading characters.
core.accessor_str.StringAccessor.match(pat) Determine if each string matches a regular expression.
core.accessor_str.StringAccessor.pad(width) Pad strings in the array up to width.
core.accessor_str.StringAccessor.repeat(repeats) Duplicate each string in the array.
core.accessor_str.StringAccessor.replace(…) Replace occurrences of pattern/regex in the array with some string.
core.accessor_str.StringAccessor.rfind(sub) Return the highest index in each string of the array where the substring is fully contained between [start:end].
core.accessor_str.StringAccessor.rindex(sub) Return the highest index in each string where the substring is fully contained between [start:end].
core.accessor_str.StringAccessor.rjust(width) Pad the left side of strings in the array with an additional character.
core.accessor_str.StringAccessor.rstrip([…]) Remove trailing characters.
core.accessor_str.StringAccessor.slice([…]) Slice substrings from each element in the array.
core.accessor_str.StringAccessor.slice_replace([…]) Replace a positional slice of a string with another value.
core.accessor_str.StringAccessor.startswith(pat) Test if the start of each string element matches a pattern.
core.accessor_str.StringAccessor.strip([…]) Remove leading and trailing characters.
core.accessor_str.StringAccessor.swapcase() Convert strings in the array to be swapcased.
core.accessor_str.StringAccessor.title() Convert strings in the array to titlecase.
core.accessor_str.StringAccessor.translate(table) Map all characters in the string through the given mapping table.
core.accessor_str.StringAccessor.upper() Convert strings in the array to uppercase.
core.accessor_str.StringAccessor.wrap(width, …) Wrap long strings in the array to be formatted in paragraphs with length less than a given width.
core.accessor_str.StringAccessor.zfill(width) Pad strings in the array by prepending ‘0’ characters.
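A sketch of the `.str` accessor on a toy string array (values are illustrative):

```python
import xarray as xr

da = xr.DataArray(["  apple", "Banana ", "cherry"], dims="x")

upper = da.str.upper()           # uppercase every element
stripped = da.str.strip()        # whitespace removed from both ends
lengths = da.str.len()           # element lengths: [7, 7, 6]
has_an = da.str.contains("an")   # [False, True, False]
```

The accessor mirrors pandas' `Series.str` API but operates elementwise on a string-dtype DataArray.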
Variable.all([dim, axis]) Reduce this Variable’s data by applying all along some dimension(s).
Variable.any([dim, axis]) Reduce this Variable’s data by applying any along some dimension(s).
Variable.argmax([dim, axis, skipna]) Reduce this Variable’s data by applying argmax along some dimension(s).
Variable.argmin([dim, axis, skipna]) Reduce this Variable’s data by applying argmin along some dimension(s).
Variable.argsort([axis, kind, order]) Returns the indices that would sort this array.
Variable.astype(dtype[, order, casting, …]) Copy of the array, cast to a specified type.
Variable.broadcast_equals(other[, equiv]) True if two Variables have the same values after being broadcast against each other; otherwise False.
Variable.chunk([chunks, name, lock]) Coerce this array’s data into a dask array with the given chunks.
Variable.clip([min, max, out]) Return an array whose values are limited to [min, max].
Variable.coarsen(windows, func[, boundary, side]) Apply a reduction function to coarsen this variable’s data.
Variable.compute(**kwargs) Manually trigger loading of this variable’s data from disk or a remote source into memory and return a new variable.
Variable.concat(variables[, dim, positions, …]) Concatenate variables along a new or existing dimension.
Variable.conj() Complex-conjugate all elements.
Variable.conjugate() Return the complex conjugate, element-wise.
Variable.copy([deep, data]) Returns a copy of this object.
Variable.count([dim, axis]) Reduce this Variable’s data by applying count along some dimension(s).
Variable.cumprod([dim, axis, skipna]) Apply cumprod along some dimension of Variable.
Variable.cumsum([dim, axis, skipna]) Apply cumsum along some dimension of Variable.
Variable.equals(other[, equiv]) True if two Variables have the same dimensions and values; otherwise False.
Variable.fillna(value)
Variable.get_axis_num(dim) Return axis number(s) corresponding to dimension(s) in this array.
Variable.identical(other[, equiv]) Like equals, but also checks attributes.
Variable.isel([indexers, …]) Return a new array indexed along the specified dimension(s).
Variable.isnull(*args, **kwargs)
Variable.item(*args) Copy an element of an array to a standard Python scalar and return it.
Variable.load(**kwargs) Manually trigger loading of this variable’s data from disk or a remote source into memory and return this variable.
Variable.max([dim, axis, skipna]) Reduce this Variable’s data by applying max along some dimension(s).
Variable.mean([dim, axis, skipna]) Reduce this Variable’s data by applying mean along some dimension(s).
Variable.median([dim, axis, skipna]) Reduce this Variable’s data by applying median along some dimension(s).
Variable.min([dim, axis, skipna]) Reduce this Variable’s data by applying min along some dimension(s).
Variable.no_conflicts(other[, equiv]) True if the intersection of two Variables’ non-null data is equal; otherwise False.
Variable.notnull(*args, **kwargs)
Variable.pad_with_fill_value([pad_widths, …]) Return a new Variable with padded data.
Variable.prod([dim, axis, skipna]) Reduce this Variable’s data by applying prod along some dimension(s).
Variable.quantile(q[, dim, interpolation, …]) Compute the qth quantile of the data along the specified dimension.
Variable.rank(dim[, pct]) Ranks the data.
Variable.reduce(func[, dim, axis, …]) Reduce this array by applying func along some dimension(s).
Variable.roll([shifts]) Return a new Variable with rolled data.
Variable.rolling_window(dim, window, window_dim) Make a rolling_window along dim and add a new_dim to the last place.
Variable.round(*args, **kwargs)
Variable.searchsorted(v[, side, sorter]) Find indices where elements of v should be inserted into this array to maintain order.
Variable.set_dims(dims[, shape]) Return a new variable with given set of dimensions.
Variable.shift([shifts, fill_value]) Return a new Variable with shifted data.
Variable.squeeze([dim]) Return a new object with squeezed data.
Variable.stack([dimensions]) Stack any number of existing dimensions into a single new dimension.
Variable.std([dim, axis, skipna]) Reduce this Variable’s data by applying std along some dimension(s).
Variable.sum([dim, axis, skipna]) Reduce this Variable’s data by applying sum along some dimension(s).
Variable.to_base_variable() Return this variable as a base xarray.Variable
Variable.to_coord() to_coord has been deprecated.
Variable.to_dict([data]) Dictionary representation of variable.
Variable.to_index() Convert this variable to a pandas.Index
Variable.to_index_variable() Return this variable as an xarray.IndexVariable
Variable.to_variable() to_variable has been deprecated.
Variable.transpose(*dims) Return a new Variable object with transposed dimensions.
Variable.unstack([dimensions]) Unstack an existing dimension into multiple new dimensions.
Variable.var([dim, axis, skipna]) Reduce this Variable’s data by applying var along some dimension(s).
Variable.where(cond[, other])
Variable.T
Variable.attrs Dictionary of local attributes on this variable.
Variable.chunks Block dimensions for this array’s data or None if it’s not a dask array.
Variable.data
Variable.dims Tuple of dimension names with which this variable is associated.
Variable.dtype
Variable.encoding Dictionary of encodings on this variable.
Variable.imag
Variable.nbytes
Variable.ndim
Variable.real
Variable.shape
Variable.size
Variable.sizes Ordered mapping from dimension names to lengths.
Variable.values The variable’s data as a numpy.ndarray
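The Variable methods above can be exercised with a minimal sketch (the array contents and dimension names here are illustrative, not from the docs):

```python
import numpy as np
import xarray as xr

# A small 2-D Variable with named dimensions.
v = xr.Variable(("x", "y"), np.arange(6.0).reshape(2, 3))

# Reductions accept a dimension name instead of a bare axis number.
col_means = v.mean(dim="x")     # averages over "x", leaving dims ("y",)

# transpose reorders dimensions by name; get_axis_num maps a
# dimension name to its positional axis.
vt = v.transpose("y", "x")
axis = v.get_axis_num("y")
```

Naming dimensions rather than axis positions is the point of the Variable API: the same reduction call keeps working after a transpose.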
IndexVariable.all([dim, axis]) Reduce this Variable’s data by applying all along some dimension(s).
IndexVariable.any([dim, axis]) Reduce this Variable’s data by applying any along some dimension(s).
IndexVariable.argmax([dim, axis, skipna]) Reduce this Variable’s data by applying argmax along some dimension(s).
IndexVariable.argmin([dim, axis, skipna]) Reduce this Variable’s data by applying argmin along some dimension(s).
IndexVariable.argsort([axis, kind, order]) Returns the indices that would sort this array.
IndexVariable.astype(dtype[, order, …]) Copy of the array, cast to a specified type.
IndexVariable.broadcast_equals(other[, equiv]) True if two Variables have the same values after being broadcast against each other; otherwise False.
IndexVariable.chunk([chunks, name, lock]) Coerce this array’s data into a dask array with the given chunks.
IndexVariable.clip([min, max, out]) Return an array whose values are limited to [min, max].
IndexVariable.coarsen(windows, func[, …]) Apply reduction function.
IndexVariable.compute(**kwargs) Manually trigger loading of this variable’s data from disk or a remote source into memory and return a new variable.
IndexVariable.concat(variables[, dim, …]) Specialized version of Variable.concat for IndexVariable objects.
IndexVariable.conj() Complex-conjugate all elements.
IndexVariable.conjugate() Return the complex conjugate, element-wise.
IndexVariable.copy([deep, data]) Returns a copy of this object.
IndexVariable.count([dim, axis]) Reduce this Variable’s data by applying count along some dimension(s).
IndexVariable.cumprod([dim, axis, skipna]) Apply cumprod along some dimension of Variable.
IndexVariable.cumsum([dim, axis, skipna]) Apply cumsum along some dimension of Variable.
IndexVariable.equals(other[, equiv]) True if two Variables have the same dimensions and values; otherwise False.
IndexVariable.fillna(value)
IndexVariable.get_axis_num(dim, …) Return axis number(s) corresponding to dimension(s) in this array.
IndexVariable.get_level_variable(level) Return a new IndexVariable from a given MultiIndex level.
IndexVariable.identical(other[, equiv]) Like equals, but also checks attributes.
IndexVariable.isel([indexers, …]) Return a new array indexed along the specified dimension(s).
IndexVariable.isnull(*args, **kwargs)
IndexVariable.item(*args) Copy an element of an array to a standard Python scalar and return it.
IndexVariable.load() Manually trigger loading of this variable’s data from disk or a remote source into memory and return this variable.
IndexVariable.max([dim, axis, skipna]) Reduce this Variable’s data by applying max along some dimension(s).
IndexVariable.mean([dim, axis, skipna]) Reduce this Variable’s data by applying mean along some dimension(s).
IndexVariable.median([dim, axis, skipna]) Reduce this Variable’s data by applying median along some dimension(s).
IndexVariable.min([dim, axis, skipna]) Reduce this Variable’s data by applying min along some dimension(s).
IndexVariable.no_conflicts(other[, equiv]) True if the intersection of two Variables’ non-null data is equal; otherwise False.
IndexVariable.notnull(*args, **kwargs)
IndexVariable.pad_with_fill_value([…]) Return a new Variable with padded data.
IndexVariable.prod([dim, axis, skipna]) Reduce this Variable’s data by applying prod along some dimension(s).
IndexVariable.quantile(q[, dim, …]) Compute the qth quantile of the data along the specified dimension.
IndexVariable.rank(dim[, pct]) Ranks the data.
IndexVariable.reduce(func[, dim, axis, …]) Reduce this array by applying func along some dimension(s).
IndexVariable.roll([shifts]) Return a new Variable with rolled data.
IndexVariable.rolling_window(dim, window, …) Make a rolling_window along dim and add a new_dim to the last place.
IndexVariable.round(*args, **kwargs)
IndexVariable.searchsorted(v[, side, sorter]) Find indices where elements of v should be inserted into this array to maintain order.
IndexVariable.set_dims(dims[, shape]) Return a new variable with given set of dimensions.
IndexVariable.shift([shifts, fill_value]) Return a new Variable with shifted data.
IndexVariable.squeeze([dim]) Return a new object with squeezed data.
IndexVariable.stack([dimensions]) Stack any number of existing dimensions into a single new dimension.
IndexVariable.std([dim, axis, skipna]) Reduce this Variable’s data by applying std along some dimension(s).
IndexVariable.sum([dim, axis, skipna]) Reduce this Variable’s data by applying sum along some dimension(s).
IndexVariable.to_base_variable() Return this variable as a base xarray.Variable
IndexVariable.to_coord() to_coord has been deprecated.
IndexVariable.to_dict([data]) Dictionary representation of variable.
IndexVariable.to_index() Convert this variable to a pandas.Index
IndexVariable.to_index_variable() Return this variable as an xarray.IndexVariable
IndexVariable.to_variable() to_variable has been deprecated.
IndexVariable.transpose(*dims) Return a new Variable object with transposed dimensions.
IndexVariable.unstack([dimensions]) Unstack an existing dimension into multiple new dimensions.
IndexVariable.var([dim, axis, skipna]) Reduce this Variable’s data by applying var along some dimension(s).
IndexVariable.where(cond[, other])
IndexVariable.T
IndexVariable.attrs Dictionary of local attributes on this variable.
IndexVariable.chunks Block dimensions for this array’s data or None if it’s not a dask array.
IndexVariable.data
IndexVariable.dims Tuple of dimension names with which this variable is associated.
IndexVariable.dtype
IndexVariable.encoding Dictionary of encodings on this variable.
IndexVariable.imag
IndexVariable.level_names Return MultiIndex level names or None if this IndexVariable has no MultiIndex.
IndexVariable.name
IndexVariable.nbytes
IndexVariable.ndim
IndexVariable.real
IndexVariable.shape
IndexVariable.size
IndexVariable.sizes Ordered mapping from dimension names to lengths.
IndexVariable.values The variable’s data as a numpy.ndarray
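A brief sketch of how an IndexVariable relates to its pandas backing (the coordinate name and dates are illustrative):

```python
import pandas as pd
import xarray as xr

# An IndexVariable is a 1-D Variable whose data is backed by a
# pandas.Index, enabling fast label-based lookups.
iv = xr.IndexVariable("time", pd.date_range("2000-01-01", periods=3))

idx = iv.to_index()           # recover the underlying pandas.Index
base = iv.to_base_variable()  # downgrade to a plain xarray.Variable
```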
ufuncs.angle xarray specific variant of numpy.angle.
ufuncs.arccos xarray specific variant of numpy.arccos.
ufuncs.arccosh xarray specific variant of numpy.arccosh.
ufuncs.arcsin xarray specific variant of numpy.arcsin.
ufuncs.arcsinh xarray specific variant of numpy.arcsinh.
ufuncs.arctan xarray specific variant of numpy.arctan.
ufuncs.arctan2 xarray specific variant of numpy.arctan2.
ufuncs.arctanh xarray specific variant of numpy.arctanh.
ufuncs.ceil xarray specific variant of numpy.ceil.
ufuncs.conj xarray specific variant of numpy.conj.
ufuncs.copysign xarray specific variant of numpy.copysign.
ufuncs.cos xarray specific variant of numpy.cos.
ufuncs.cosh xarray specific variant of numpy.cosh.
ufuncs.deg2rad xarray specific variant of numpy.deg2rad.
ufuncs.degrees xarray specific variant of numpy.degrees.
ufuncs.exp xarray specific variant of numpy.exp.
ufuncs.expm1 xarray specific variant of numpy.expm1.
ufuncs.fabs xarray specific variant of numpy.fabs.
ufuncs.fix xarray specific variant of numpy.fix.
ufuncs.floor xarray specific variant of numpy.floor.
ufuncs.fmax xarray specific variant of numpy.fmax.
ufuncs.fmin xarray specific variant of numpy.fmin.
ufuncs.fmod xarray specific variant of numpy.fmod.
ufuncs.frexp xarray specific variant of numpy.frexp.
ufuncs.hypot xarray specific variant of numpy.hypot.
ufuncs.imag xarray specific variant of numpy.imag.
ufuncs.iscomplex xarray specific variant of numpy.iscomplex.
ufuncs.isfinite xarray specific variant of numpy.isfinite.
ufuncs.isinf xarray specific variant of numpy.isinf.
ufuncs.isnan xarray specific variant of numpy.isnan.
ufuncs.isreal xarray specific variant of numpy.isreal.
ufuncs.ldexp xarray specific variant of numpy.ldexp.
ufuncs.log xarray specific variant of numpy.log.
ufuncs.log10 xarray specific variant of numpy.log10.
ufuncs.log1p xarray specific variant of numpy.log1p.
ufuncs.log2 xarray specific variant of numpy.log2.
ufuncs.logaddexp xarray specific variant of numpy.logaddexp.
ufuncs.logaddexp2 xarray specific variant of numpy.logaddexp2.
ufuncs.logical_and xarray specific variant of numpy.logical_and.
ufuncs.logical_not xarray specific variant of numpy.logical_not.
ufuncs.logical_or xarray specific variant of numpy.logical_or.
ufuncs.logical_xor xarray specific variant of numpy.logical_xor.
ufuncs.maximum xarray specific variant of numpy.maximum.
ufuncs.minimum xarray specific variant of numpy.minimum.
ufuncs.nextafter xarray specific variant of numpy.nextafter.
ufuncs.rad2deg xarray specific variant of numpy.rad2deg.
ufuncs.radians xarray specific variant of numpy.radians.
ufuncs.real xarray specific variant of numpy.real.
ufuncs.rint xarray specific variant of numpy.rint.
ufuncs.sign xarray specific variant of numpy.sign.
ufuncs.signbit xarray specific variant of numpy.signbit.
ufuncs.sin xarray specific variant of numpy.sin.
ufuncs.sinh xarray specific variant of numpy.sinh.
ufuncs.sqrt xarray specific variant of numpy.sqrt.
ufuncs.square xarray specific variant of numpy.square.
ufuncs.tan xarray specific variant of numpy.tan.
ufuncs.tanh xarray specific variant of numpy.tanh.
ufuncs.trunc xarray specific variant of numpy.trunc.
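These wrappers exist so that applying a universal function preserves xarray’s labels; on recent versions the plain NumPy ufuncs give the same behavior for xarray objects, as this small sketch (with illustrative data) shows:

```python
import numpy as np
import xarray as xr

# Applying a NumPy ufunc to an xarray object keeps dims and coords
# intact; the xarray.ufuncs variants fill the same label-preserving role.
da = xr.DataArray(np.array([0.0, np.pi / 2]), dims="x")
out = np.sin(da)  # still a DataArray with dims ("x",)
```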
plot.FacetGrid.map_dataarray(func, x, y, …) Apply a plotting function to a 2d facet’s subset of the data.
plot.FacetGrid.set_titles([template, …]) Draw titles either above each facet or on the grid margins.
plot.FacetGrid.set_ticks([max_xticks, …]) Set and control tick behavior
plot.FacetGrid.map(func, *args, **kwargs) Apply a plotting function to each facet’s subset of the data.
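A FacetGrid is usually obtained from a faceted plot call rather than constructed directly; a minimal sketch (random data, headless matplotlib backend assumed):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so no display is needed
import numpy as np
import xarray as xr

# Faceting along "z" returns a FacetGrid; map_dataarray is invoked
# internally once per panel by the plotting method.
da = xr.DataArray(np.random.rand(2, 3, 4), dims=("z", "y", "x"))
fg = da.plot(col="z")
fg.set_titles()  # default "{coord} = {value}" titles above each facet
```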
CFTimeIndex.all(*args, **kwargs) Return whether all elements are True.
CFTimeIndex.any(*args, **kwargs) Return whether any element is True.
CFTimeIndex.append(other) Append a collection of Index options together.
CFTimeIndex.argmax([axis, skipna]) Return an ndarray of the maximum argument indexer.
CFTimeIndex.argmin([axis, skipna]) Return an ndarray of the minimum argument indexer.
CFTimeIndex.argsort(*args, **kwargs) Return the integer indices that would sort the index.
CFTimeIndex.asof(label) Return the label from the index, or, if not present, the previous one.
CFTimeIndex.asof_locs(where, mask) Find the locations (indices) of the labels from the index for every entry in the where argument.
CFTimeIndex.astype(dtype[, copy]) Create an Index with values cast to dtypes.
CFTimeIndex.contains(key) Needed for .loc based partial-string indexing
CFTimeIndex.copy([name, deep, dtype]) Make a copy of this object.
CFTimeIndex.delete(loc) Make new Index with passed location(-s) deleted.
CFTimeIndex.difference(other[, sort]) Return a new Index with elements from the index that are not in other.
CFTimeIndex.drop(labels[, errors]) Make new Index with passed list of labels deleted.
CFTimeIndex.drop_duplicates([keep]) Return Index with duplicate values removed.
CFTimeIndex.droplevel([level]) Return index with requested level(s) removed.
CFTimeIndex.dropna([how]) Return Index without NA/NaN values
CFTimeIndex.duplicated([keep]) Indicate duplicate index values.
CFTimeIndex.equals(other) Determine if two Index objects contain the same elements.
CFTimeIndex.factorize([sort, na_sentinel]) Encode the object as an enumerated type or categorical variable.
CFTimeIndex.fillna([value, downcast]) Fill NA/NaN values with the specified value
CFTimeIndex.format([name, formatter]) Render a string representation of the Index.
CFTimeIndex.get_indexer(target[, method, …]) Compute indexer and mask for new index given the current index.
CFTimeIndex.get_indexer_for(target, **kwargs) Guaranteed return of an indexer even when non-unique.
CFTimeIndex.get_indexer_non_unique(target) Compute indexer and mask for new index given the current index.
CFTimeIndex.get_level_values(level) Return an Index of values for requested level.
CFTimeIndex.get_loc(key[, method, tolerance]) Adapted from pandas.tseries.index.DatetimeIndex.get_loc
CFTimeIndex.get_slice_bound(label, side, kind) Calculate slice bound that corresponds to given label.
CFTimeIndex.get_value(series, key) Adapted from pandas.tseries.index.DatetimeIndex.get_value
CFTimeIndex.groupby(values) Group the index labels by a given array of values.
CFTimeIndex.holds_integer() Whether the type is an integer type.
CFTimeIndex.identical(other) Similar to equals, but check that other comparable attributes are also equal.
CFTimeIndex.insert(loc, item) Make new Index inserting new item at location.
CFTimeIndex.intersection(other[, sort]) Form the intersection of two Index objects.
CFTimeIndex.is_(other) More flexible, faster check like is but that works through views.
CFTimeIndex.is_boolean()
CFTimeIndex.is_categorical() Check if the Index holds categorical data.
CFTimeIndex.is_floating()
CFTimeIndex.is_integer()
CFTimeIndex.is_interval()
CFTimeIndex.is_mixed()
CFTimeIndex.is_numeric()
CFTimeIndex.is_object()
CFTimeIndex.is_type_compatible(kind) Whether the index type is compatible with the provided type.
CFTimeIndex.isin(values[, level]) Return a boolean array where the index values are in values.
CFTimeIndex.isna() Detect missing values.
CFTimeIndex.isnull() Detect missing values.
CFTimeIndex.item() Return the first element of the underlying data as a python scalar.
CFTimeIndex.join(other[, how, level, …]) Compute join_index and indexers to conform data structures to the new index.
CFTimeIndex.map(mapper[, na_action]) Map values using input correspondence (a dict, Series, or function).
CFTimeIndex.max([axis, skipna]) Return the maximum value of the Index.
CFTimeIndex.memory_usage([deep]) Memory usage of the values
CFTimeIndex.min([axis, skipna]) Return the minimum value of the Index.
CFTimeIndex.notna() Detect existing (non-missing) values.
CFTimeIndex.notnull() Detect existing (non-missing) values.
CFTimeIndex.nunique([dropna]) Return number of unique elements in the object.
CFTimeIndex.putmask(mask, value) Return a new Index of the values set with the mask.
CFTimeIndex.ravel([order]) Return an ndarray of the flattened values of the underlying data.
CFTimeIndex.reindex(target[, method, level, …]) Create index with target’s values (move/add/delete values as necessary).
CFTimeIndex.rename(name[, inplace]) Alter Index or MultiIndex name.
CFTimeIndex.repeat(repeats[, axis]) Repeat elements of a Index.
CFTimeIndex.searchsorted(value[, side, sorter]) Find indices where elements should be inserted to maintain order.
CFTimeIndex.set_names(names[, level, inplace]) Set Index or MultiIndex name.
CFTimeIndex.set_value(arr, key, value) Fast lookup of value from 1-dimensional ndarray.
CFTimeIndex.shift(n, freq) Shift the CFTimeIndex a multiple of the given frequency.
CFTimeIndex.slice_indexer([start, end, …]) For an ordered or unique index, compute the slice indexer for input labels and step.
CFTimeIndex.slice_locs([start, end, step, kind]) Compute slice locations for input labels.
CFTimeIndex.sort(*args, **kwargs) Use sort_values instead.
CFTimeIndex.sort_values([return_indexer, …]) Return a sorted copy of the index.
CFTimeIndex.sortlevel([level, ascending, …]) For internal compatibility with the Index API.
CFTimeIndex.strftime(date_format) Return an Index of formatted strings specified by date_format, which supports the same string format as the python standard library.
CFTimeIndex.symmetric_difference(other[, …]) Compute the symmetric difference of two Index objects.
CFTimeIndex.take(indices[, axis, …]) Return a new Index of the values selected by the indices.
CFTimeIndex.to_datetimeindex([unsafe]) If possible, convert this index to a pandas.DatetimeIndex.
CFTimeIndex.to_flat_index() Identity method.
CFTimeIndex.to_frame([index, name]) Create a DataFrame with a column containing the Index.
CFTimeIndex.to_list() Return a list of the values.
CFTimeIndex.to_native_types([slicer]) Format specified values of self and return them.
CFTimeIndex.to_numpy([dtype, copy]) A NumPy ndarray representing the values in this Series or Index.
CFTimeIndex.to_series([index, name]) Create a Series with both index and values equal to the index keys; useful with map for returning an indexer based on an index.
CFTimeIndex.tolist() Return a list of the values.
CFTimeIndex.transpose(*args, **kwargs) Return the transpose, which is by definition self.
CFTimeIndex.union(other[, sort]) Form the union of two Index objects.
CFTimeIndex.unique([level]) Return unique values in the index.
CFTimeIndex.value_counts([normalize, sort, …]) Return a Series containing counts of unique values.
CFTimeIndex.view([cls])
CFTimeIndex.where(cond[, other]) Return an Index of same shape as self and whose corresponding entries are from self where cond is True and otherwise are from other.
CFTimeIndex.T Return the transpose, which is by definition self.
CFTimeIndex.array The ExtensionArray of the data backing this Series or Index.
CFTimeIndex.asi8 Integer representation of the values.
CFTimeIndex.date_type
CFTimeIndex.day The days of the datetime
CFTimeIndex.dayofweek The day of week of the datetime
CFTimeIndex.dayofyear The ordinal day of year of the datetime
CFTimeIndex.dtype Return the dtype object of the underlying data.
CFTimeIndex.empty
CFTimeIndex.has_duplicates
CFTimeIndex.hasnans Return if I have any nans; enables various perf speedups.
CFTimeIndex.hour The hours of the datetime
CFTimeIndex.inferred_type Return a string of the type inferred from the values.
CFTimeIndex.is_all_dates
CFTimeIndex.is_monotonic Alias for is_monotonic_increasing.
CFTimeIndex.is_monotonic_increasing Return if the index is monotonic increasing (only equal or increasing) values.
CFTimeIndex.is_monotonic_decreasing Return if the index is monotonic decreasing (only equal or decreasing) values.
CFTimeIndex.is_unique Return if the index has unique values.
CFTimeIndex.microsecond The microseconds of the datetime
CFTimeIndex.minute The minutes of the datetime
CFTimeIndex.month The month of the datetime
CFTimeIndex.name
CFTimeIndex.names
CFTimeIndex.nbytes Return the number of bytes in the underlying data.
CFTimeIndex.ndim Number of dimensions of the underlying data, by definition 1.
CFTimeIndex.nlevels Number of levels.
CFTimeIndex.second The seconds of the datetime
CFTimeIndex.shape Return a tuple of the shape of the underlying data.
CFTimeIndex.size Return the number of elements in the underlying data.
CFTimeIndex.values Return an array representing the data in the Index.
CFTimeIndex.year The year of the datetime
backends.NetCDF4DataStore.close(**kwargs)
backends.NetCDF4DataStore.encode(variables, …) Encode the variables and attributes in this store.
backends.NetCDF4DataStore.encode_attribute(a) Encode one attribute.
backends.NetCDF4DataStore.encode_variable(…) Encode one variable.
backends.NetCDF4DataStore.get(k[,d])
backends.NetCDF4DataStore.get_attrs()
backends.NetCDF4DataStore.get_dimensions()
backends.NetCDF4DataStore.get_encoding()
backends.NetCDF4DataStore.get_variables()
backends.NetCDF4DataStore.items()
backends.NetCDF4DataStore.keys()
backends.NetCDF4DataStore.load() This loads the variables and attributes simultaneously.
backends.NetCDF4DataStore.open(filename[, …])
backends.NetCDF4DataStore.open_store_variable(…)
backends.NetCDF4DataStore.prepare_variable(…)
backends.NetCDF4DataStore.set_attribute(key, …)
backends.NetCDF4DataStore.set_attributes(…) This provides a centralized method to set the dataset attributes on the data store.
backends.NetCDF4DataStore.set_dimension(…)
backends.NetCDF4DataStore.set_dimensions(…) This provides a centralized method to set the dimensions on the data store.
backends.NetCDF4DataStore.set_variable(k, v)
backends.NetCDF4DataStore.set_variables(…) This provides a centralized method to set the variables on the data store.
backends.NetCDF4DataStore.store(variables, …) Top-level method for putting data on this store.
backends.NetCDF4DataStore.store_dataset(dataset) In a data store, “variables” includes coordinates, whereas xarray.Dataset.variables excludes them, so the whole dataset is passed in rather than dataset.variables.
backends.NetCDF4DataStore.sync()
backends.NetCDF4DataStore.values()
backends.NetCDF4DataStore.attrs
backends.NetCDF4DataStore.autoclose
backends.NetCDF4DataStore.dimensions
backends.NetCDF4DataStore.ds
backends.NetCDF4DataStore.format
backends.NetCDF4DataStore.is_remote
backends.NetCDF4DataStore.lock
backends.NetCDF4DataStore.variables
backends.H5NetCDFStore.close(**kwargs)
backends.H5NetCDFStore.encode(variables, …) Encode the variables and attributes in this store.
backends.H5NetCDFStore.encode_attribute(a) Encode one attribute.
backends.H5NetCDFStore.encode_variable(variable) Encode one variable.
backends.H5NetCDFStore.get(k[,d])
backends.H5NetCDFStore.get_attrs()
backends.H5NetCDFStore.get_dimensions()
backends.H5NetCDFStore.get_encoding()
backends.H5NetCDFStore.get_variables()
backends.H5NetCDFStore.items()
backends.H5NetCDFStore.keys()
backends.H5NetCDFStore.load() This loads the variables and attributes simultaneously.
backends.H5NetCDFStore.open_store_variable(…)
backends.H5NetCDFStore.prepare_variable(…)
backends.H5NetCDFStore.set_attribute(key, value)
backends.H5NetCDFStore.set_attributes(attributes) This provides a centralized method to set the dataset attributes on the data store.
backends.H5NetCDFStore.set_dimension(name, …)
backends.H5NetCDFStore.set_dimensions(variables) This provides a centralized method to set the dimensions on the data store.
backends.H5NetCDFStore.set_variable(k, v)
backends.H5NetCDFStore.set_variables(…[, …]) This provides a centralized method to set the variables on the data store.
backends.H5NetCDFStore.store(variables, …) Top-level method for putting data on this store.
backends.H5NetCDFStore.store_dataset(dataset) In a data store, “variables” includes coordinates, whereas xarray.Dataset.variables excludes them, so the whole dataset is passed in rather than dataset.variables.
backends.H5NetCDFStore.sync()
backends.H5NetCDFStore.values()
backends.H5NetCDFStore.attrs
backends.H5NetCDFStore.dimensions
backends.H5NetCDFStore.ds
backends.H5NetCDFStore.variables
backends.PydapDataStore.close()
backends.PydapDataStore.get(k[,d])
backends.PydapDataStore.get_attrs()
backends.PydapDataStore.get_dimensions()
backends.PydapDataStore.get_encoding()
backends.PydapDataStore.get_variables()
backends.PydapDataStore.items()
backends.PydapDataStore.keys()
backends.PydapDataStore.load() This loads the variables and attributes simultaneously.
backends.PydapDataStore.open(url[, session])
backends.PydapDataStore.open_store_variable(var)
backends.PydapDataStore.values()
backends.PydapDataStore.attrs
backends.PydapDataStore.dimensions
backends.PydapDataStore.variables
backends.ScipyDataStore.close()
backends.ScipyDataStore.encode(variables, …) Encode the variables and attributes in this store.
backends.ScipyDataStore.encode_attribute(a) Encode one attribute.
backends.ScipyDataStore.encode_variable(variable) Encode one variable.
backends.ScipyDataStore.get(k[,d])
backends.ScipyDataStore.get_attrs()
backends.ScipyDataStore.get_dimensions()
backends.ScipyDataStore.get_encoding()
backends.ScipyDataStore.get_variables()
backends.ScipyDataStore.items()
backends.ScipyDataStore.keys()
backends.ScipyDataStore.load() This loads the variables and attributes simultaneously.
backends.ScipyDataStore.open_store_variable(…)
backends.ScipyDataStore.prepare_variable(…)
backends.ScipyDataStore.set_attribute(key, value)
backends.ScipyDataStore.set_attributes(…) This provides a centralized method to set the dataset attributes on the data store.
backends.ScipyDataStore.set_dimension(name, …)
backends.ScipyDataStore.set_dimensions(variables) This provides a centralized method to set the dimensions on the data store.
backends.ScipyDataStore.set_variable(k, v)
backends.ScipyDataStore.set_variables(…[, …]) This provides a centralized method to set the variables on the data store.
backends.ScipyDataStore.store(variables, …) Top-level method for putting data on this store.
backends.ScipyDataStore.store_dataset(dataset) In a data store, “variables” includes coordinates, whereas xarray.Dataset.variables excludes them, so the whole dataset is passed in rather than dataset.variables.
backends.ScipyDataStore.sync()
backends.ScipyDataStore.values()
backends.ScipyDataStore.attrs
backends.ScipyDataStore.dimensions
backends.ScipyDataStore.ds
backends.ScipyDataStore.variables
backends.FileManager.acquire([needs_lock]) Acquire the file object from this manager.
backends.FileManager.acquire_context([…]) Context manager for acquiring a file.
backends.FileManager.close([needs_lock]) Close the file object associated with this manager, if needed.
backends.CachingFileManager.acquire([needs_lock]) Acquire a file object from the manager.
backends.CachingFileManager.acquire_context([…]) Context manager for acquiring a file.
backends.CachingFileManager.close([needs_lock]) Explicitly close any associated file object (if necessary).
backends.DummyFileManager.acquire([needs_lock]) Acquire the file object from this manager.
backends.DummyFileManager.acquire_context([…]) Context manager for acquiring a file.
backends.DummyFileManager.close([needs_lock]) Close the file object associated with this manager, if needed.
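The FileManager protocol can be sketched with CachingFileManager wrapping the builtin open as the opener (the file path and contents here are illustrative):

```python
import os
import tempfile
from xarray.backends import CachingFileManager

# CachingFileManager wraps an "opener" callable plus its arguments, so
# the file handle can be acquired, cached, and transparently re-opened.
path = os.path.join(tempfile.mkdtemp(), "demo.txt")
with open(path, "w") as f:
    f.write("hello")

manager = CachingFileManager(open, path, mode="r")
with manager.acquire_context() as f:  # acquire within a context manager
    text = f.read()
manager.close()  # explicitly release the cached handle
```

Backends use this indirection so that files closed to stay under the open-file limit can be re-acquired on demand.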