Enter the query into the form above. You can look for a specific version of a package by using the @ symbol, like this: gcc@10.
API method:
GET /api/packages?search=hello&page=1&limit=20
where search is your query, page is the page number, and limit is the number of items per page. Pagination information (such as the total number of pages) is returned
in the response headers.
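For example, a minimal sketch of calling the API from Python with the requests library (the base URL is a placeholder for this site's address, and the response body is assumed to be JSON):

    import requests

    # Placeholder base URL; substitute the address of this site.
    resp = requests.get(
        "https://example.org/api/packages",
        params={"search": "hello", "page": 1, "limit": 20},
    )
    resp.raise_for_status()
    print(resp.json())       # the matching packages (assumed JSON body)
    print(resp.headers)      # pagination information lives here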
If you'd like to join our channel webring, send a patch to ~whereiseveryone/toys@lists.sr.ht adding your channel as an entry in channels.scm.
unyt is a Python library for working with data that has physical units. It defines the unyt.array.unyt_array and unyt.array.unyt_quantity classes (subclasses of NumPy's ndarray class) for handling arrays and scalars with units, respectively.
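A minimal sketch of these classes in use (illustrative values only):

    from unyt import unyt_array, unyt_quantity

    d = unyt_array([1.0, 2.0, 4.0], "km")   # an ndarray subclass carrying units
    t = unyt_quantity(2.0, "hr")            # a scalar with units
    v = (d / t).to("m/s")                   # unit-aware arithmetic and conversion
    print(v)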
Formulaic is a high-performance implementation of Wilkinson formulas for Python.
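A minimal sketch of building design matrices from a Wilkinson formula with formulaic's model_matrix (illustrative data):

    import pandas as pd
    from formulaic import model_matrix

    df = pd.DataFrame({"y": [0.0, 1.0, 2.0],
                       "a": [1.0, 2.0, 3.0],
                       "b": ["u", "v", "u"]})
    # A two-sided formula yields the response and the design matrix;
    # the categorical column b is dummy-encoded automatically.
    y, X = model_matrix("y ~ a + b", df)
    print(X)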
Vector is a Python library for 2D and 3D spatial vectors, as well as 4D space-time vectors. It is especially intended for performing geometric calculations on arrays of vectors, rather than one vector at a time in a Python for loop.
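A minimal sketch (illustrative values), showing a single vector and an array of vectors:

    import vector

    v = vector.obj(x=3.0, y=4.0)    # one 2D vector
    print(v.rho, v.phi)             # magnitude and azimuthal angle

    # The same properties work elementwise on arrays of vectors:
    arr = vector.array({"x": [1.0, 3.0], "y": [2.0, 4.0]})
    print(arr.rho)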
Vaex is a high-performance Python library for lazy out-of-core DataFrames (similar to Pandas), used to visualize and explore big tabular datasets. This package provides the core modules of Vaex.
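A minimal sketch of the lazy evaluation model (illustrative data):

    import numpy as np
    import vaex

    df = vaex.from_arrays(x=np.arange(1_000_000))
    df["y"] = df.x ** 2      # a virtual column: defined lazily, no copy made
    print(df.mean(df.y))     # aggregations stream over the data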
PyTensor is a Python library that allows one to define, optimize, and efficiently evaluate mathematical expressions involving multi-dimensional arrays. It is a fork of the Aesara library.
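A minimal sketch of defining and compiling a symbolic expression (with its gradient) in PyTensor:

    import pytensor
    import pytensor.tensor as pt

    x = pt.dvector("x")                              # symbolic input
    y = (x ** 2).sum()                               # symbolic expression
    f = pytensor.function([x], pytensor.grad(y, x))  # compiled gradient function
    print(f([1.0, 2.0, 3.0]))                        # [2. 4. 6.]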
This package provides a Python library for manipulating indices of ndarrays.
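Assuming this refers to the ndindex library, a minimal sketch:

    from ndindex import ndindex

    idx = ndindex(slice(-3, None))   # wrap a raw index in an ndindex object
    reduced = idx.reduce((10,))      # canonicalize against a concrete shape
    print(reduced.raw)               # slice(7, 10, 1)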
This package lets you generate a multiscale, chunked, multi-dimensional spatial image data structure that can be serialized to OME-NGFF. Each scale is a scientific Python Xarray spatial-image Dataset, organized into nodes of an Xarray Datatree.
A Snakemake executor plugin for running srun jobs inside of SLURM jobs (meant for internal use by python-snakemake-executor-plugin-slurm).
This is the Python package for ECOS: Embedded Cone Solver. ECOS is numerical software for solving convex second-order cone programs (SOCPs).
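A minimal sketch: a one-variable problem in ECOS's conic form, minimize c'x subject to Gx + s = h with s in the positive orthant:

    import numpy as np
    import scipy.sparse as sp
    import ecos

    c = np.array([1.0])           # minimize x
    G = sp.csc_matrix([[-1.0]])   # -x + s = 1 with s >= 0, i.e. x >= -1
    h = np.array([1.0])
    dims = {"l": 1, "q": []}      # one linear cone constraint, no SOC blocks
    sol = ecos.solve(c, G, h, dims)
    print(sol["x"])               # approximately [-1.]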
vedo is a fast and lightweight Python module for scientific analysis and visualization. The package provides a wide range of functionality for working with three-dimensional meshes and point clouds. It can also be used to generate high-quality two-dimensional renderings such as scatter plots and histograms. vedo is based on VTK and NumPy.
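A minimal sketch (requires a display for the interactive window):

    from vedo import Sphere, show

    sphere = Sphere(r=1.0).color("tomato")   # a 3D mesh
    show(sphere, axes=1)                     # render it interactively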
This package provides utilities for exploratory analysis of large scale genetic variation data.
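Assuming this is the scikit-allel library, a minimal sketch with a toy genotype array:

    import allel

    # Two variants x two samples, diploid calls.
    g = allel.GenotypeArray([[[0, 0], [0, 1]],
                             [[0, 2], [1, 1]]])
    print(g.count_alleles())   # per-variant allele counts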
This package provides a Python interface for the SCS (Splitting conic solver) library.
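A minimal sketch assuming the 3.x Python bindings (the same toy problem as above, minimize x subject to x >= -1, in SCS's Ax + s = b form):

    import numpy as np
    import scipy.sparse as sp
    import scs

    data = dict(A=sp.csc_matrix([[-1.0]]),
                b=np.array([1.0]),
                c=np.array([1.0]))
    cone = dict(l=1)               # one positive-orthant constraint
    solver = scs.SCS(data, cone)
    sol = solver.solve()
    print(sol["x"])                # approximately [-1.]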
climin is a Python package for optimization, heavily biased toward machine learning scenarios. It works on top of numpy and (partially) gnumpy.
Einops provides a set of tensor operations for NumPy and multiple deep learning frameworks.
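A minimal sketch with NumPy (the same patterns work with the deep learning backends):

    import numpy as np
    from einops import rearrange, reduce

    x = np.random.rand(8, 3, 32, 32)             # batch, channels, height, width
    flat = rearrange(x, "b c h w -> b (c h w)")  # flatten all but the batch axis
    pooled = reduce(x, "b c (h h2) (w w2) -> b c h w",
                    "mean", h2=2, w2=2)          # 2x2 mean pooling
    print(flat.shape, pooled.shape)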
AlgoPy provides functionality for differentiating functions implemented as computer programs, using Algorithmic Differentiation (AD) techniques in forward and reverse mode.
The forward mode propagates univariate Taylor polynomials of arbitrary order, so AlgoPy can also be used to evaluate higher-order derivative tensors. The reverse mode is also known as backpropagation and can be found in similar form in tools like PyTorch. A speciality of AlgoPy is the ability to differentiate functions that contain matrix operations such as +, -, *, /, dot, solve, qr, eigh, and cholesky.
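A minimal forward-mode sketch, following the pattern from AlgoPy's documentation:

    import algopy

    def f(x):
        # an ordinary Python function of a vector x
        return x[0] * x[1] * x[2] + 7 * x[1]

    # Seed a Jacobian computation, run the function, extract the result.
    x = algopy.UTPM.init_jacobian([3.0, 5.0, 7.0])
    y = f(x)
    print(algopy.UTPM.extract_jacobian(y))   # gradient of f at (3, 5, 7)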
This package provides an extremely lightweight compatibility layer between dataframe libraries (see the sketch after this list):
full API support: cuDF, Modin, pandas, Polars, PyArrow
lazy-only support: Dask, DuckDB, Ibis, PySpark, SQLFrame
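A minimal sketch of writing backend-agnostic dataframe code with narwhals (pandas is used here only to demonstrate):

    import pandas as pd
    import narwhals as nw

    def total(df_native):
        # Works unchanged for any supported backend (pandas, Polars, ...).
        df = nw.from_native(df_native)
        return df.select(nw.col("a").sum()).to_native()

    print(total(pd.DataFrame({"a": [1, 2, 3]})))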
Optimized einsum can significantly reduce the overall execution time of einsum-like expressions by optimizing the expression's contraction order and dispatching many operations to canonical BLAS, cuBLAS, or other specialized routines. Optimized einsum is agnostic to the backend and can handle NumPy, Dask, PyTorch, TensorFlow, CuPy, Sparse, Theano, JAX, and Autograd arrays, as well as potentially any library which conforms to a standard API. See the documentation for more information.
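A minimal sketch with NumPy arrays:

    import numpy as np
    from opt_einsum import contract

    a = np.random.rand(50, 60)
    b = np.random.rand(60, 70)
    c = np.random.rand(70, 40)
    # Same semantics as np.einsum, but the contraction order is optimized.
    result = contract("ij,jk,kl->il", a, b, c)
    print(result.shape)   # (50, 40)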
Snakemake aims to reduce the complexity of creating workflows by providing a clean and modern domain-specific specification language (DSL) in Python style, together with a fast and comfortable execution environment.
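A minimal sketch of a Snakefile in that DSL (the file names are hypothetical):

    rule all:
        input:
            "counts.txt"

    rule count_lines:
        input:
            "data.txt"
        output:
            "counts.txt"
        shell:
            "wc -l {input} > {output}"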
Scikit-Optimize, or skopt, is a simple and efficient library to minimize (very) expensive and noisy black-box functions. It implements several methods for sequential model-based optimization. skopt aims to be accessible and easy to use in many contexts.
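A minimal sketch minimizing a toy one-dimensional objective:

    from skopt import gp_minimize

    def f(params):
        x = params[0]
        return (x - 2.0) ** 2           # stand-in for an expensive black box

    res = gp_minimize(f, [(-5.0, 5.0)], n_calls=20, random_state=0)
    print(res.x, res.fun)               # best point and best value found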
This package contains public type stubs for python-pandas, following the convention of providing stubs in a separate package, as specified in PEP 561. The stubs cover the most typical use cases of python-pandas. In general, these stubs are narrower than what python-pandas actually allows, but they follow a convention of suggesting best recommended practices for using python-pandas.
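A minimal sketch of what the stubs enable: with them installed, a checker such as mypy can verify annotations like these, while nothing changes at runtime:

    import pandas as pd

    def column_mean(df: pd.DataFrame, name: str) -> float:
        # mypy checks the argument and return types against the stubs
        return float(df[name].mean())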
This Python module uses matplotlib to visualize multidimensional samples using a scatterplot matrix. In these visualizations, each one- and two-dimensional projection of the sample is plotted to reveal covariances. corner was originally conceived to display the results of Markov Chain Monte Carlo simulations, and the defaults are chosen with this application in mind, but it can be used to display many qualitatively different samples.
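A minimal sketch with synthetic samples standing in for MCMC output:

    import numpy as np
    import corner

    samples = np.random.randn(10_000, 3)   # 10000 samples, 3 dimensions
    figure = corner.corner(samples, labels=["a", "b", "c"])
    figure.savefig("corner.png")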
This package implements functionality for mean-preserving interpolation of 1D data (for example, time series) with splines.
This is a package meant primarily for documenting histogram indexing and the PlottableHistogram Protocol and any future cross-library standards. It also contains the code for the PlottableHistogram Protocol, to be used in type checking libraries wanting to conform to the protocol. It is not usually a runtime dependency, but only a type checking, testing, and/or docs dependency in support of other libraries.
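A minimal sketch of depending on the protocol rather than a concrete histogram class (assumes a one-dimensional histogram):

    from uhi.typing.plottable import PlottableHistogram

    def total_count(h: PlottableHistogram) -> float:
        # Any conforming object (e.g. a boost-histogram histogram) is accepted;
        # a type checker verifies conformance statically.
        return float(sum(h.values()))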
This package provides a Python library for building and analyzing recommender systems that deal with explicit rating data (see the sketch after this list). It was designed with the following purposes in mind:
Provide tools to handle downloaded or user-provided datasets.
Provide ready-to-use prediction algorithms and similarity measures.
Provide a base for creating custom algorithms.
Provide tools to evaluate, analyze, and compare algorithm performance.
Provide documentation with precise details regarding library algorithms.
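A minimal sketch in the style of the library's own quickstart (downloads the built-in MovieLens-100k dataset on first use):

    from surprise import SVD, Dataset
    from surprise.model_selection import cross_validate

    data = Dataset.load_builtin("ml-100k")
    # Evaluate an SVD-based rating predictor with 5-fold cross-validation.
    cross_validate(SVD(), data, measures=["RMSE", "MAE"], cv=5, verbose=True)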