Enter your query into the form above. You can look for a specific version of a package by using the @ symbol, like this: gcc@10.
API method:
GET /api/packages?search=hello&page=1&limit=20
where search is your query, page is the page number, and limit is the number of items on a single page. Pagination information (such as the total number of pages) is returned
in the response headers.
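For example, the endpoint can be queried from Python with the standard library (a minimal sketch; the base URL is a placeholder and the shape of the JSON body and the pagination header names are assumptions, not documented here):

    # Minimal sketch of calling the search API. The base URL is a placeholder
    # and the response body is assumed to be JSON; adjust to the real instance.
    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    BASE_URL = "https://example.org"  # placeholder: replace with the actual instance URL

    params = urlencode({"search": "gcc@10", "page": 1, "limit": 20})
    with urlopen(f"{BASE_URL}/api/packages?{params}") as response:
        # Pagination details are sent in the response headers.
        print(dict(response.headers))
        packages = json.load(response)  # assumption: JSON list of packages

    for package in packages:
        print(package)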
If you'd like to join our channel webring, send a patch to ~whereiseveryone/toys@lists.sr.ht adding your channel as an entry in channels.scm.
Scikit-learn provides simple and efficient tools for data mining and data analysis.
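As a quick illustration (a minimal sketch using scikit-learn's bundled iris dataset; the choice of classifier is arbitrary):

    # Minimal sketch: fit a classifier on the bundled iris dataset and
    # report held-out accuracy. The specific estimator is an arbitrary choice.
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("accuracy:", clf.score(X_test, y_test))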
This Python library provides several solvers for optimization problems related to Optimal Transport for signal, image processing and machine learning.
Autograd can automatically differentiate native Python and NumPy code. It can handle a large subset of Python's features, including loops, ifs, recursion and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization.
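For instance (a minimal sketch; the function being differentiated is just an example):

    # Minimal sketch: reverse-mode differentiation of a plain Python/NumPy
    # function, including a second derivative by composing grad with itself.
    import autograd.numpy as np
    from autograd import grad

    def tanh(x):
        return (1.0 - np.exp(-2 * x)) / (1.0 + np.exp(-2 * x))

    d_tanh = grad(tanh)        # first derivative
    dd_tanh = grad(d_tanh)     # derivative of the derivative

    print(d_tanh(1.0), dd_tanh(1.0))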
This package provides simple access to speech-to-text on Linux without being tied to a desktop environment, using the vosk-api. The user configuration lets you manipulate text using Python string operations. It has zero overhead, as it relies on manual activation and there are no background processes. Dictation is accessed manually with the nerd-dictation begin and nerd-dictation end commands.
OpenFst is a library for constructing, combining, optimizing, and searching weighted finite-state transducers (FSTs).
This is a package for hassle-free computation of shareable, comparable, and reproducible BLEU, chrF, and TER scores for natural language processing.
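For example (a minimal sketch; the hypothesis and reference sentences are made up for illustration):

    # Minimal sketch: corpus-level BLEU and chrF for one hypothesis sentence
    # against one reference stream. The example sentences are made up.
    import sacrebleu

    hypotheses = ["the cat sat on the mat"]
    references = [["the cat is sitting on the mat"]]  # one list per reference stream

    bleu = sacrebleu.corpus_bleu(hypotheses, references)
    chrf = sacrebleu.corpus_chrf(hypotheses, references)
    print(bleu.score, chrf.score)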
This package provides OCaml bindings for the MCL graph clustering algorithm.
This library is used internally by PyTorch as a header-only library.
Hmmlearn is a set of algorithms for unsupervised learning and inference of Hidden Markov Models.
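For example (a minimal sketch; the random data and the number of hidden states are arbitrary):

    # Minimal sketch: fit a Gaussian HMM to random data and decode the
    # most likely hidden-state sequence. Data and model size are arbitrary.
    import numpy as np
    from hmmlearn import hmm

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))          # 200 observations, 2 features

    model = hmm.GaussianHMM(n_components=3, n_iter=50)
    model.fit(X)
    states = model.predict(X)
    print(states[:10])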
This package provides a Python library for probabilistic modeling and inference.
This package provides logging utilities for the spaCy natural language processing framework.
ONNX is a format for AI models, both deep learning and traditional ML. It defines an extensible computation graph model, as well as definitions of built-in operators and standard data types.
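A minimal sketch of the graph model: build a one-node graph with the Python helpers and validate it (the tensor shapes and names are arbitrary):

    # Minimal sketch: construct a tiny ONNX graph containing a single Relu
    # node and check that the resulting model is well formed.
    import onnx
    from onnx import TensorProto, helper

    node = helper.make_node("Relu", inputs=["x"], outputs=["y"])
    graph = helper.make_graph(
        [node], "relu_graph",
        [helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 4])],
        [helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 4])],
    )
    model = helper.make_model(graph)
    onnx.checker.check_model(model)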
This package implements a variety of persistent homology algorithms. It provides an interface for:
- computing persistence cohomology of sparse and dense data sets
- visualizing persistence diagrams
- computing lowerstar filtrations on images
- computing representative cochains
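A minimal sketch of computing persistence diagrams for a random point cloud (assuming this entry is the ripser.py package; the data is arbitrary):

    # Minimal sketch, assuming this entry is ripser.py: compute persistence
    # diagrams for a random 2-D point cloud.
    import numpy as np
    from ripser import ripser

    rng = np.random.default_rng(0)
    points = rng.normal(size=(100, 2))

    result = ripser(points, maxdim=1)
    for dim, dgm in enumerate(result["dgms"]):
        print(f"H{dim}: {len(dgm)} intervals")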
Gloo is a collective communications library. It comes with a number of collective algorithms useful for machine learning applications. These include a barrier, broadcast, and allreduce.
This package provides Autograd-compatible approximations to the gamma family of functions.
CTranslate2 is a C++ and Python library for efficient inference with Transformer models.
The project implements a custom runtime that applies many performance optimization techniques, such as weight quantization, layer fusion, and batch reordering, to accelerate Transformer models and reduce their memory usage on CPU and GPU.
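A minimal sketch of running inference (assuming a model already converted to the CTranslate2 format; the model path and the pre-tokenized input are placeholders):

    # Minimal sketch, assuming a model has already been converted to the
    # CTranslate2 format with one of the project's converters.
    # The model path and the pre-tokenized input are placeholders.
    import ctranslate2

    translator = ctranslate2.Translator("ende_ctranslate2/", device="cpu")
    results = translator.translate_batch([["▁Hello", "▁world", "!"]])
    print(results[0].hypotheses[0])   # best hypothesis as a list of tokens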
This package contains legacy registered functions for spaCy backwards compatibility.
SentencePiece is an unsupervised text tokenizer and detokenizer mainly for Neural Network-based text generation systems where the vocabulary size is predetermined prior to the neural model training. SentencePiece implements subword units, such as byte-pair encoding (BPE) and the unigram language model, with the extension of direct training from raw sentences. SentencePiece allows us to make a purely end-to-end system that does not depend on language-specific pre- or post-processing.
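A minimal sketch of training a model directly from raw text and tokenizing with it (corpus.txt, the model prefix, and the vocabulary size are placeholders):

    # Minimal sketch: train a subword model directly from raw sentences,
    # then encode and decode text with it. corpus.txt is a placeholder file.
    import sentencepiece as spm

    spm.SentencePieceTrainer.train(
        input="corpus.txt", model_prefix="m", vocab_size=1000)

    sp = spm.SentencePieceProcessor(model_file="m.model")
    pieces = sp.encode("This is a test.", out_type=str)
    print(pieces)
    print(sp.decode(pieces))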
This package provides an implementation of today’s most used tokenizers, with a focus on performance and versatility.
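For example (a minimal sketch, assuming this entry is the Hugging Face tokenizers library; the training file is a placeholder):

    # Minimal sketch, assuming this entry is the Hugging Face "tokenizers"
    # library: train a small BPE tokenizer from scratch. corpus.txt is a
    # placeholder training file.
    from tokenizers import Tokenizer
    from tokenizers.models import BPE
    from tokenizers.pre_tokenizers import Whitespace
    from tokenizers.trainers import BpeTrainer

    tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
    tokenizer.pre_tokenizer = Whitespace()

    trainer = BpeTrainer(vocab_size=1000, special_tokens=["[UNK]"])
    tokenizer.train(files=["corpus.txt"], trainer=trainer)

    print(tokenizer.encode("Hello, world!").tokens)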
QNNPACK is a library for low-precision neural network inference. It contains the implementation of common neural network operators on quantized 8-bit tensors.
Brian is a simulator for spiking neural networks written in Python. It is therefore designed to be easy to learn and use, highly flexible and easily extensible.