Enter your query into the form above. You can look for a specific version of a package by using the @ symbol, like this: gcc@10.
API method:
GET /api/packages?search=hello&page=1&limit=20
where search is your query, page is the page number, and limit is the number of items per page. Pagination information (such as the total number of pages) is returned in response headers.
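For example, the endpoint can be called from Python roughly as follows (a minimal sketch: the base URL is a placeholder, the response body is assumed to be JSON, and the exact pagination header names are whatever the server sends):

import requests

# Placeholder base URL; substitute the address of this site.
BASE = "https://example.org"

resp = requests.get(
    f"{BASE}/api/packages",
    params={"search": "hello", "page": 1, "limit": 20},
)
resp.raise_for_status()

packages = resp.json()   # one page of matching packages (assuming a JSON body)
print(resp.headers)      # pagination details (page counts etc.) arrive here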
If you'd like to join our channel webring send a patch to ~whereiseveryone/toys@lists.sr.ht adding your channel as an entry in channels.scm.
TensorPipe provides a tensor-aware channel to transfer rich objects from one process to another while using the fastest transport for the tensors contained therein.
This package provides OCaml bindings for the MCL graph clustering algorithm.
Low-precision, high-performance matrix-matrix multiplication and convolution library for server-side inference.
QNNPACK is a library for low-precision neural network inference. It contains the implementation of common neural network operators on quantized 8-bit tensors.
This package provides a Python library to easily read single characters and key strokes.
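A hedged usage sketch, assuming this is the readchar library from PyPI with its readkey() function and readchar.key constants:

import readchar

# Block until a single keypress; special keys (arrows, etc.) are returned
# as complete escape sequences rather than one byte at a time.
key = readchar.readkey()

if key == readchar.key.UP:
    print("arrow up")
else:
    print(f"you pressed: {key!r}")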
Apache TVM is a compiler stack for deep learning systems. It is designed to close the gap between productivity-focused deep learning frameworks and performance- and efficiency-focused hardware backends. TVM works with deep learning frameworks to provide end-to-end compilation to different backends.
Scikit-rebate is a scikit-learn-compatible Python implementation of ReBATE, a suite of Relief-based feature selection algorithms for Machine Learning. These algorithms excel at identifying features that are predictive of the outcome in supervised learning problems, and are especially good at identifying feature interactions that are normally overlooked by standard feature selection algorithms.
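A minimal sketch of the scikit-learn-style interface, assuming the PyPI module name skrebate and its ReliefF estimator:

import numpy as np
from skrebate import ReliefF

# Toy data: 100 samples, 10 features, binary labels.
X = np.random.rand(100, 10)
y = np.random.randint(0, 2, size=100)

fs = ReliefF(n_features_to_select=3, n_neighbors=10)
fs.fit(X, y)

print(fs.feature_importances_)   # Relief-based score per feature
X_reduced = fs.transform(X)      # keep only the selected features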
This package provides a command line interface for Lightning AI services.
This package enables you to deserialize Lua torch-serialized objects from Python.
This package provides a Python wrapper for the SentencePiece unsupervised text tokenizer.
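A short sketch of the wrapper's API (the model file path is a placeholder for a previously trained SentencePiece model):

import sentencepiece as spm

sp = spm.SentencePieceProcessor(model_file="model.model")

pieces = sp.encode("Hello world.", out_type=str)  # subword pieces
ids = sp.encode("Hello world.", out_type=int)     # corresponding ids
print(pieces, ids)
print(sp.decode(ids))                             # back to the original text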
PyTorch is a Python package that provides two high-level features:
tensor computation (like NumPy) with strong GPU acceleration;
deep neural networks (DNNs) built on a tape-based autograd system.
You can reuse Python packages such as NumPy, SciPy, and Cython to extend PyTorch when needed.
Note: currently this package does not provide GPU support.
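A short sketch of both features on the CPU:

import torch

# Tensor computation, NumPy-style.
x = torch.randn(3, 3)
y = x @ x.t() + 1.0

# Tape-based autograd: operations on tensors with requires_grad=True are
# recorded, and backward() walks the tape to compute gradients.
w = torch.randn(3, requires_grad=True)
loss = (w ** 2).sum()
loss.backward()
print(w.grad)   # equals 2 * w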
Brian is a simulator for spiking neural networks written in Python, and is therefore designed to be easy to learn and use, highly flexible, and easily extensible.
ONNX is a format for AI models, both deep learning and traditional ML. It defines an extensible computation graph model, as well as definitions of built-in operators and standard data types.
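For example, a one-node graph can be assembled and validated with the onnx Python helpers (a minimal sketch; the output file name is arbitrary):

import onnx
from onnx import helper, TensorProto

# y = Relu(x), with x and y as float tensors of shape [1, 4].
x = helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 4])
y = helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 4])
node = helper.make_node("Relu", inputs=["x"], outputs=["y"])

graph = helper.make_graph([node], "tiny-graph", [x], [y])
model = helper.make_model(graph)

onnx.checker.check_model(model)   # validate against the ONNX spec
onnx.save(model, "tiny.onnx")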
Random Jungle is an implementation of Random Forests designed to analyse high-dimensional data. In genetics, it can be used for analysing large Genome Wide Association (GWA) data sets. Random Forests is a powerful machine learning method; its most notable features are variable selection, missing value imputation, classifier creation, generalization error estimation, and sample proximities between pairs of cases.
This library is used internally by PyTorch as a header-only library.
This is a real-time full-duplex speech recognition server, based on the Kaldi toolkit and the GStreamer framework and implemented in Python.
SentencePiece is an unsupervised text tokenizer and detokenizer mainly for Neural Network-based text generation systems where the vocabulary size is predetermined prior to the neural model training. SentencePiece implements subword units---e.g., byte-pair-encoding (BPE) and unigram language model---with the extension of direct training from raw sentences. SentencePiece allows us to make a purely end-to-end system that does not depend on language-specific pre- or post-processing.
HDBSCAN - Hierarchical Density-Based Spatial Clustering of Applications with Noise. Performs DBSCAN over varying epsilon values and integrates the result to find a clustering that gives the best stability over epsilon. This allows HDBSCAN to find clusters of varying densities (unlike DBSCAN), and be more robust to parameter selection. HDBSCAN is ideal for exploratory data analysis; it's a fast and robust algorithm that you can trust to return meaningful clusters (if there are any).
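A minimal sketch with the hdbscan Python module and synthetic data:

import numpy as np
import hdbscan

# Two toy blobs of different density.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, size=(50, 2)),
               rng.normal(5.0, 1.0, size=(50, 2))])

clusterer = hdbscan.HDBSCAN(min_cluster_size=10)
labels = clusterer.fit_predict(X)   # label -1 marks points treated as noise
print(set(labels))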
This package implements the Hopcroft-Karp algorithm, producing a maximum cardinality matching from a bipartite graph.
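A hedged usage sketch, assuming this is the hopcroftkarp library from PyPI, whose HopcroftKarp class takes a dict mapping left-side vertices to the sets of right-side vertices they are connected to:

from hopcroftkarp import HopcroftKarp

# Bipartite graph: letters on the left, numbers on the right.
graph = {"a": {1, 2}, "b": {1}, "c": {2, 3}}
matching = HopcroftKarp(graph).maximum_matching()
print(matching)   # a maximum cardinality matching between the two sides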
OpenFst is a library for constructing, combining, optimizing, and searching weighted finite-state transducers (FSTs).
PyG is a library built upon PyTorch to easily write and train Graph Neural Networks for a wide range of applications related to structured data.
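A small sketch of the building blocks: a toy graph pushed through one graph convolution layer.

import torch
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# 3 nodes with 8 features each, 4 directed edges.
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]], dtype=torch.long)
x = torch.randn(3, 8)
data = Data(x=x, edge_index=edge_index)

conv = GCNConv(in_channels=8, out_channels=16)
out = conv(data.x, data.edge_index)   # node embeddings, shape [3, 16]
print(out.shape)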
Hmmlearn is a set of algorithms for unsupervised learning and inference of Hidden Markov Models.
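A minimal sketch fitting a Gaussian HMM to a toy observation sequence:

import numpy as np
from hmmlearn import hmm

X = np.random.randn(500, 2)   # a 2-D observation sequence

model = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
model.fit(X)

states = model.predict(X)        # most likely hidden state per observation
log_likelihood = model.score(X)  # log-likelihood of the sequence under the model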
This package provides Autograd-compatible approximations to the gamma family of functions.
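A hedged sketch, assuming the module is importable as autograd_gamma and exposes gammainc(a, x), an Autograd-compatible regularized lower incomplete gamma function:

from autograd import grad
from autograd_gamma import gammainc   # assumed module and function names

# Differentiate P(a, x) with respect to the shape parameter a, which the
# plain SciPy implementation does not support.
dP_da = grad(gammainc, 0)
print(gammainc(2.0, 3.0), dP_da(2.0, 3.0))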