Enter the query into the form above. You can look for a specific version of a package by using the @ symbol, like this: gcc@10.
API method:
GET /api/packages?search=hello&page=1&limit=20
where search is your query, page is the page number, and limit is the number of items on a single page. Pagination information (such as the number of pages) is returned in response headers.
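As an illustration, here is a minimal sketch of calling the endpoint from Python with the requests library; the base URL is a placeholder, and the assumption that the endpoint returns JSON is noted in the comments:

    import requests

    # Hypothetical base URL; replace with the actual host serving this API.
    BASE_URL = "https://example.org"

    response = requests.get(
        f"{BASE_URL}/api/packages",
        params={"search": "hello", "page": 1, "limit": 20},
    )
    response.raise_for_status()

    packages = response.json()   # assuming the endpoint returns a JSON body
    print(response.headers)      # pagination information is in the response headers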
If you'd like to join our channel webring, send a patch to ~whereiseveryone/toys@lists.sr.ht adding your channel as an entry in channels.scm.
This package is a toolbox for optimization on Riemannian manifolds with support for automatic differentiation.
Scikit-rebate is a scikit-learn-compatible Python implementation of ReBATE, a suite of Relief-based feature selection algorithms for Machine Learning. These algorithms excel at identifying features that are predictive of the outcome in supervised learning problems, and are especially good at identifying feature interactions that are normally overlooked by standard feature selection algorithms.
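A minimal sketch of the scikit-learn-style workflow, assuming the package is importable as skrebate and exposes the ReliefF estimator:

    import numpy as np
    from skrebate import ReliefF               # assumed module and class names
    from sklearn.pipeline import make_pipeline
    from sklearn.ensemble import RandomForestClassifier

    X = np.random.rand(100, 20)
    y = np.random.randint(0, 2, size=100)

    # ReliefF scores features and keeps the most predictive ones; a downstream
    # classifier is then trained on the reduced feature set.
    pipeline = make_pipeline(
        ReliefF(n_features_to_select=5, n_neighbors=10),
        RandomForestClassifier(n_estimators=50),
    )
    pipeline.fit(X, y)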
This package provides a generic API for dispatch to Pyro backends.
This package provides common Python utilities and GitHub Actions for the Lightning suite of libraries.
This framework provides an easy method to compute dense vector representations for sentences, paragraphs, and images. The models are based on transformer networks like BERT / RoBERTa / XLM-RoBERTa and achieve state-of-the-art performance on various tasks. Text is embedded in vector space such that similar texts are closer and can be found efficiently using cosine similarity.
This package provides easy access to pretrained models for more than 100 languages, fine-tuned for various use-cases.
Further, this framework allows easy fine-tuning of custom embedding models to achieve maximal performance on your specific task.
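A minimal sketch of embedding sentences and comparing them by cosine similarity; the model name below is one of the published pretrained checkpoints and is used here as an assumption:

    from sentence_transformers import SentenceTransformer, util

    # "all-MiniLM-L6-v2" is a commonly published pretrained checkpoint;
    # any other pretrained model name can be substituted.
    model = SentenceTransformer("all-MiniLM-L6-v2")

    sentences = [
        "The cat sits on the mat.",
        "A feline rests on a rug.",
        "Stock prices fell today.",
    ]
    embeddings = model.encode(sentences)

    # Cosine similarity: semantically similar sentences score higher.
    similarities = util.cos_sim(embeddings, embeddings)
    print(similarities)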
This package provides an implementation of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) for Python.
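A brief sketch of the ask/tell loop, assuming this is the cmaes package with its CMA class:

    import numpy as np
    from cmaes import CMA   # assumed entry point exposing the ask/tell interface

    def sphere(x):
        return float(np.sum(x ** 2))

    optimizer = CMA(mean=np.zeros(2), sigma=1.3)
    for generation in range(20):
        solutions = []
        for _ in range(optimizer.population_size):
            x = optimizer.ask()                # sample a candidate solution
            solutions.append((x, sphere(x)))   # evaluate it
        optimizer.tell(solutions)              # update the search distribution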
Low-precision, high-performance matrix-matrix multiplications and convolution library for server-side inference.
This package provides simple speech-to-text access for use on Linux without being tied to a desktop environment, using the vosk-api. The user configuration lets you manipulate text using Python string operations. It has zero overhead, as it relies on manual activation and there are no background processes. Dictation is started and stopped manually with the nerd-dictation begin and nerd-dictation end commands.
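A hedged sketch of the user configuration, assuming the documented layout where ~/.config/nerd-dictation/nerd-dictation.py defines a nerd_dictation_process function that receives the recognized text and returns the text to type:

    # ~/.config/nerd-dictation/nerd-dictation.py (assumed location and hook name)

    def nerd_dictation_process(text):
        # Plain Python string operations on the recognized text.
        text = text.replace("new line", "\n")
        text = text.replace("full stop", ".")
        return text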
This package implements a variety of persistent homology algorithms. It provides an interface for
computing persistence cohomology of sparse and dense data sets
visualizing persistence diagrams
computing lower-star filtrations on images
computing representative cochains
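A minimal sketch of this workflow, assuming the ripser.py interface where ripser() returns persistence diagrams and the companion persim package plots them:

    import numpy as np
    from ripser import ripser           # assumed interface: ripser(X) -> {"dgms": [...]}
    from persim import plot_diagrams   # companion plotting helper (assumption)

    # 100 points sampled from a noisy circle: H1 should show one prominent loop.
    theta = np.random.uniform(0, 2 * np.pi, 100)
    data = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * np.random.randn(100, 2)

    diagrams = ripser(data)["dgms"]    # persistence diagrams for H0, H1, ...
    plot_diagrams(diagrams, show=True)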
This package provides a command line interface for Lightning AI services.
Lantern provides a C API to the libtorch machine learning library.
QNNPACK is a library for low-precision neural network inference. It contains the implementation of common neural network operators on quantized 8-bit tensors.
The torchvision package consists of popular datasets, model architectures, and common image transformations for computer vision.
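A minimal sketch of the three pieces mentioned above, a dataset, a model architecture, and an image transformation pipeline; the dataset download path is illustrative:

    import torch
    from torchvision import datasets, models, transforms

    # Common image transformation pipeline.
    transform = transforms.Compose([
        transforms.Resize(224),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
    ])

    # Popular dataset (downloaded to ./data on first run).
    dataset = datasets.CIFAR10(root="./data", train=True, download=True,
                               transform=transform)

    # Popular model architecture, randomly initialized by default.
    model = models.resnet18()
    image, label = dataset[0]
    output = model(image.unsqueeze(0))   # add a batch dimension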
NNPACK is an acceleration package for neural network computations. NNPACK aims to provide high-performance implementations of convnet layers for multi-core CPUs.
NNPACK is not intended to be directly used by machine learning researchers; instead it provides low-level performance primitives leveraged in leading deep learning frameworks, such as PyTorch, Caffe2, MXNet, tiny-dnn, Caffe, Torch, and Darknet.
BoTorch is a library for Bayesian Optimization built on PyTorch.
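A compact sketch of one Bayesian optimization step, fitting a GP surrogate to observed points and maximizing an Expected Improvement acquisition function; function names follow recent BoTorch releases and may differ across versions:

    import torch
    from botorch.models import SingleTaskGP
    from botorch.fit import fit_gpytorch_mll   # named fit_gpytorch_model in older releases
    from botorch.acquisition import ExpectedImprovement
    from botorch.optim import optimize_acqf
    from gpytorch.mlls import ExactMarginalLogLikelihood

    # Observed data: maximize a toy objective on the unit square.
    train_X = torch.rand(10, 2, dtype=torch.double)
    train_Y = -(train_X - 0.5).pow(2).sum(dim=-1, keepdim=True)

    gp = SingleTaskGP(train_X, train_Y)
    fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))

    # Suggest the next point to evaluate.
    acqf = ExpectedImprovement(gp, best_f=train_Y.max())
    bounds = torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double)
    candidate, _ = optimize_acqf(acqf, bounds=bounds, q=1,
                                 num_restarts=5, raw_samples=32)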
The GeomLoss library provides efficient GPU implementations for:
Kernel norms (also known as Maximum Mean Discrepancies).
Hausdorff divergences, which are positive definite generalizations of the Chamfer-ICP loss and are analogous to log-likelihoods of Gaussian Mixture Models.
Debiased Sinkhorn divergences, which are affordable yet positive and definite approximations of Optimal Transport (Wasserstein) distances.
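A minimal sketch of computing a debiased Sinkhorn divergence between two point clouds with the SamplesLoss interface; the parameter values are illustrative:

    import torch
    from geomloss import SamplesLoss   # assumed entry point for sample-based losses

    x = torch.randn(500, 3, requires_grad=True)   # source point cloud
    y = torch.randn(600, 3)                       # target point cloud

    # Debiased Sinkhorn divergence, an approximation of the Wasserstein distance.
    loss_fn = SamplesLoss(loss="sinkhorn", p=2, blur=0.05)
    loss = loss_fn(x, y)
    loss.backward()   # gradients flow back to the source points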
TorchMetrics is a collection of 100+ PyTorch metrics implementations and an easy-to-use API to create custom metrics. It offers:
A standardized interface to increase reproducibility
Reduced boilerplate
Automatic accumulation over batches
Metrics optimized for distributed training
Automatic synchronization between multiple devices
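A minimal sketch of the update/compute pattern behind the automatic accumulation over batches; the task-style constructor arguments follow recent TorchMetrics releases:

    import torch
    import torchmetrics

    # Metric state accumulates across batches and is reduced by compute().
    accuracy = torchmetrics.Accuracy(task="multiclass", num_classes=5)

    for _ in range(3):                              # three batches
        preds = torch.randn(16, 5).softmax(dim=-1)
        target = torch.randint(0, 5, (16,))
        accuracy.update(preds, target)              # per-batch accumulation

    print(accuracy.compute())                       # metric over all seen batches
    accuracy.reset()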
DMLC-Core is the backbone library supporting all DMLC projects; it offers the bricks to build efficient and scalable distributed machine learning libraries.
PyTorch Lightning is just organized PyTorch; Lightning disentangles PyTorch code to decouple the science from the engineering.
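A minimal sketch of that split: the LightningModule holds the science (model, loss, optimizer) while the Trainer handles the engineering loop; the toy data and model are illustrative:

    import torch
    from torch import nn
    from torch.utils.data import DataLoader, TensorDataset
    import pytorch_lightning as pl

    class LitRegressor(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.net = nn.Linear(10, 1)

        def training_step(self, batch, batch_idx):
            x, y = batch
            loss = nn.functional.mse_loss(self.net(x), y)
            self.log("train_loss", loss)
            return loss

        def configure_optimizers(self):
            return torch.optim.Adam(self.parameters(), lr=1e-3)

    dataset = TensorDataset(torch.randn(64, 10), torch.randn(64, 1))
    trainer = pl.Trainer(max_epochs=1, logger=False, enable_checkpointing=False)
    trainer.fit(LitRegressor(), DataLoader(dataset, batch_size=16))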
This package implements the Hopcroft-Karp algorithm, producing a maximum cardinality matching from a bipartite graph.
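A hedged sketch, assuming the package exposes a HopcroftKarp class that takes the bipartite graph as a dict mapping left vertices to sets of right vertices:

    from hopcroftkarp import HopcroftKarp   # assumed module and class names

    # Left vertices "a", "b", "c"; right vertices 1 and 2.
    graph = {"a": {1, 2}, "b": {1}, "c": {2}}

    # Maximum cardinality matching of the bipartite graph.
    matching = HopcroftKarp(graph).maximum_matching()
    print(matching)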
PyTorch is a Python package that provides two high-level features:
tensor computation (like NumPy) with strong GPU acceleration;
deep neural networks (DNNs) built on a tape-based autograd system.
You can reuse Python packages such as NumPy, SciPy, and Cython to extend PyTorch when needed.
Note: currently this package does not provide GPU support.
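A minimal sketch of both features, NumPy-like tensor computation and the tape-based autograd system; GPU calls are omitted since this build has no GPU support:

    import torch

    # Tensor computation with a NumPy-like API.
    a = torch.randn(3, 3)
    b = torch.randn(3, 3)
    c = a @ b + a.sum()

    # Tape-based autograd: operations on x are recorded, backward() replays the tape.
    x = torch.randn(3, requires_grad=True)
    y = (x ** 2).sum()
    y.backward()
    print(x.grad)   # dy/dx = 2x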
Lap is a linear assignment problem solver using the Jonker-Volgenant algorithm for dense (LAPJV) or sparse (LAPMOD) matrices.
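A brief sketch, assuming the package is importable as lap and that lapjv returns the total cost together with the row and column assignments:

    import numpy as np
    import lap   # assumed module name

    cost = np.array([[4.0, 1.0, 3.0],
                     [2.0, 0.0, 5.0],
                     [3.0, 2.0, 2.0]])

    # lapjv solves the dense linear assignment problem.
    total_cost, x, y = lap.lapjv(cost)   # x[i]: column assigned to row i
    print(total_cost, x)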
XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way.
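A minimal sketch using the scikit-learn-style wrapper; the toy data and hyperparameters are illustrative:

    import numpy as np
    import xgboost as xgb

    X = np.random.rand(200, 10)
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

    # Gradient-boosted tree ensemble for binary classification.
    model = xgb.XGBClassifier(n_estimators=50, max_depth=3, learning_rate=0.1)
    model.fit(X, y)
    print(model.predict(X[:5]))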
PyNNDescent provides a Python implementation of Nearest Neighbor Descent for k-neighbor-graph construction and approximate nearest neighbor search.
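A minimal sketch of building a k-neighbor-graph index and querying it; the class and method names assume the pynndescent package's NNDescent interface:

    import numpy as np
    from pynndescent import NNDescent   # assumed entry point

    data = np.random.rand(1000, 20)

    # Build an approximate k-neighbor graph over the data.
    index = NNDescent(data, n_neighbors=15)

    # Query approximate nearest neighbors for new points.
    queries = np.random.rand(5, 20)
    neighbors, distances = index.query(queries, k=10)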