Enter your query into the form above. You can look for a specific version of a package by using the @ symbol, like this: gcc@10.
API method:
GET /api/packages?search=hello&page=1&limit=20
where search is your query, page is the page number, and limit is the number of items on a single page. Pagination information (such as the number of pages) is returned in the response headers.
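For example, a minimal Python sketch of calling the endpoint (the host name is a placeholder, the requests library is assumed to be available, and a JSON response body is assumed):

import requests

# Query the package search API; replace example.org with the actual host.
response = requests.get(
    "https://example.org/api/packages",
    params={"search": "hello", "page": 1, "limit": 20},
)
response.raise_for_status()
packages = response.json()      # matching packages for this page (assumed JSON)
print(response.headers)         # pagination details are returned in the headers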
If you'd like to join our channel webring, send a patch to ~whereiseveryone/toys@lists.sr.ht adding your channel as an entry in channels.scm.
Modin uses Ray or Dask to provide an effortless way to speed up your pandas notebooks, scripts, and libraries. Unlike other distributed DataFrame libraries, Modin provides seamless integration and compatibility with existing pandas code.
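A minimal sketch of the documented drop-in usage pattern, where the pandas import is swapped for Modin's (the file and column names are placeholders):

import modin.pandas as pd   # drop-in replacement for the pandas import

df = pd.read_csv("data.csv")              # the read is distributed via Ray or Dask
print(df.groupby("some_column").mean())   # same pandas API, parallel execution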
unyt is a Python library for working with data that has physical units. It defines the unyt.array.unyt_array and unyt.array.unyt_quantity classes (subclasses of NumPy's ndarray class) for handling arrays and scalars with units, respectively.
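A minimal sketch of creating and converting unit-aware arrays with the classes mentioned above:

import unyt

distances = unyt.unyt_array([1.0, 2.0, 3.0], "km")   # array with units
elapsed = unyt.unyt_quantity(10.0, "s")              # scalar with units
print(distances.to("m"))        # unit conversion
print(distances / elapsed)      # arithmetic propagates units (km/s)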
python-pandera provides a flexible and expressive API for performing data validation on dataframe-like objects to make data processing pipelines more readable and robust. Dataframes contain information that python-pandera explicitly validates at runtime. This is useful in production-critical data pipelines or reproducible research settings. With python-pandera, you can do the following (a short usage sketch follows the list):
Define a schema once and use it to validate different dataframe types.
Check the types and properties of columns.
Perform more complex statistical validation like hypothesis testing.
Seamlessly integrate with existing data pipelines via function decorators.
Define dataframe models with the class-based API with pydantic-style syntax.
Synthesize data from schema objects for property-based testing.
Lazily validate dataframes so that all validation rules are executed.
Integrate with a rich ecosystem of tools like
python-pydantic, python-fastapi and python-mypy.
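A minimal sketch of defining and applying a schema with pandera (the column names and checks are illustrative):

import pandas as pd
import pandera as pa

schema = pa.DataFrameSchema({
    "price": pa.Column(float, pa.Check.gt(0)),               # type and value check
    "category": pa.Column(str, pa.Check.isin(["a", "b"])),
})

df = pd.DataFrame({"price": [1.5, 2.0], "category": ["a", "b"]})
validated = schema.validate(df)   # raises a SchemaError if any check fails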
Scikit-image is a collection of algorithms for image processing.
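A minimal sketch using one of scikit-image's bundled sample images and its filters module:

from skimage import data, filters

image = data.camera()           # built-in grayscale test image
edges = filters.sobel(image)    # edge magnitude via the Sobel filter
print(edges.shape)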
AlgoPy provides functionality to differentiate functions implemented as computer programs, using Algorithmic Differentiation (AD) techniques in forward and reverse mode.
The forward mode propagates univariate Taylor polynomials of arbitrary order, so AlgoPy can also be used to evaluate higher-order derivative tensors. The reverse mode, also known as backpropagation, can be found in similar form in tools like PyTorch. A speciality of AlgoPy is its ability to differentiate functions that contain matrix operations such as +, -, *, /, dot, solve, qr, eigh and cholesky.
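A minimal forward-mode sketch following AlgoPy's documented UTPM Jacobian helpers (the function f is an arbitrary example):

import algopy

def f(x):
    return x[0] * x[1] * x[2] + 7 * x[1]

x = algopy.UTPM.init_jacobian([3.0, 5.0, 7.0])   # seed the independent variables
y = f(x)                                          # evaluate using Taylor arithmetic
jacobian = algopy.UTPM.extract_jacobian(y)        # gradient of the scalar output
print(jacobian)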
A LEMS simulator written in Python which can be used to run NeuroML2 models.
Einops provides a set of tensor operations for NumPy and multiple deep learning frameworks.
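A minimal sketch of einops' core operations on a NumPy array (the axis names are illustrative):

import numpy as np
from einops import rearrange, reduce

images = np.random.rand(8, 32, 32, 3)              # batch, height, width, channel
chw = rearrange(images, "b h w c -> b c h w")      # reorder axes by name
means = reduce(images, "b h w c -> b c", "mean")   # average over the spatial axes
print(chw.shape, means.shape)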
A Snakemake executor plugin for running srun jobs inside of SLURM jobs (meant for internal use by python-snakemake-executor-plugin-slurm).
Histoprint uses a mix of terminal color codes and Unicode trickery (i.e. combining characters) to plot overlaying histograms.
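A minimal sketch, assuming histoprint's print_hist helper and a NumPy histogram as input:

import numpy as np
from histoprint import print_hist

data = np.random.randn(1000)
print_hist(np.histogram(data, bins=20))   # draws the histogram in the terminal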
This package provides utilities for exploratory analysis of large scale genetic variation data.
This package implements functionality to create and manipulate plot legends for matplotlib.
pykdtree is a kd-tree implementation for fast nearest neighbour search in Python.
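A minimal sketch of building a tree and running a nearest-neighbour query with pykdtree:

import numpy as np
from pykdtree.kdtree import KDTree

points = np.random.rand(1000, 3)    # reference points
queries = np.random.rand(10, 3)     # query points

tree = KDTree(points)
dist, idx = tree.query(queries, k=1)   # distance to and index of the nearest neighbour
print(dist[0], idx[0])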
This package provides an extremely lightweight compatibility layer between dataframe libraries (a short usage sketch follows the support lists):
Full API support: cuDF, Modin, pandas, Polars, PyArrow
Lazy-only support: Dask, DuckDB, Ibis, PySpark, SQLFrame
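A minimal sketch of a dataframe-agnostic function built on the narwhals API (the column names are illustrative):

import narwhals as nw

def add_total(df_native):
    df = nw.from_native(df_native)                                    # wrap any supported dataframe
    df = df.with_columns((nw.col("a") + nw.col("b")).alias("total"))  # backend-agnostic expression
    return df.to_native()                                             # return the caller's own type

# e.g. with pandas:
# import pandas as pd
# print(add_total(pd.DataFrame({"a": [1, 2], "b": [3, 4]})))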
This package implements schema validation for Xarray objects.
Dvc objects provides filesystem- and object-db-level abstractions for use in dvc and dvc-data.
SALib provides tools for global sensitivity analysis. It contains Sobol', Morris, FAST, DGSM, PAWN, HDMR, Moment Independent and fractional factorial methods.
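A minimal sketch of a Sobol' analysis using SALib's sample and analyze modules (the model is an arbitrary test function):

import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 2,
    "names": ["x1", "x2"],
    "bounds": [[0.0, 1.0], [0.0, 1.0]],
}

param_values = saltelli.sample(problem, 1024)          # generate input samples
Y = np.sin(param_values[:, 0]) + param_values[:, 1]    # run the model
Si = sobol.analyze(problem, Y)                         # sensitivity indices
print(Si["S1"], Si["ST"])                              # first- and total-order indices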
This package provides a set of tools and Python modules for setting up, manipulating, running, visualizing and analyzing atomistic simulations.
This is a Python package for time series classification.
This is a package meant primarily for documenting histogram indexing and the PlottableHistogram Protocol and any future cross-library standards. It also contains the code for the PlottableHistogram Protocol, to be used in type checking libraries wanting to conform to the protocol. It is not usually a runtime dependency, but only a type checking, testing, and/or docs dependency in support of other libraries.
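A minimal sketch of type-hinting a function against the protocol, assuming uhi's uhi.typing.plottable module:

import numpy as np
from uhi.typing.plottable import PlottableHistogram

def total(hist: PlottableHistogram) -> float:
    # Any histogram object that follows the protocol can be passed here;
    # values() returns the bin contents as an array.
    return float(np.sum(hist.values()))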
This package provides a simplified scipy.signal.spectral module to do spectral analysis in Python.
This package provides an efficient implementation of Friedman's SuperSmoother in Python. It makes use of NumPy for fast numerical computation.
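A minimal sketch of the package's fit/predict interface on synthetic data:

import numpy as np
from supersmoother import SuperSmoother

t = np.linspace(0, 10, 200)
y = np.sin(t) + 0.3 * np.random.randn(200)

model = SuperSmoother()
model.fit(t, y)                # fit the adaptive smoother
y_smooth = model.predict(t)    # evaluate the smoothed curve at the inputs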
PyZX is a Python tool implementing the theory of ZX-calculus for the creation, visualisation, and automated rewriting of large-scale quantum circuits. PyZX currently allows you to do the following (a short usage sketch follows the list):
Read in quantum circuits in QASM, Quipper or Quantomatic format;
Rewrite circuits into a pseudo-normal form using the ZX-calculus;
Extract new simplified circuits from these reduced graphs;
Visualise the ZX-graphs and rewrites using Matplotlib or Quantomatic, or export them as TikZ files for use in LaTeX documents;
Output the optimised circuits in QASM, QC or QUIPPER format.
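A minimal sketch of the read, simplify, extract, and output workflow described above ("circuit.qasm" is a placeholder file name):

import pyzx as zx

circuit = zx.Circuit.load("circuit.qasm")    # also accepts Quipper/Quantomatic files
graph = circuit.to_graph()                   # convert the circuit to a ZX-graph
zx.full_reduce(graph)                        # rewrite with ZX-calculus simplifications
optimised = zx.extract_circuit(graph)        # pull a simplified circuit back out
print(optimised.to_qasm())                   # output in QASM format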
This package contains public type stubs for python-pandas, following the convention of providing stubs in a separate package, as specified in PEP 561. The stubs cover the most typical use cases of python-pandas. In general, these stubs are narrower than what python-pandas actually allows, but they follow a convention of suggesting recommended best practices for using python-pandas.