Enter your query into the form above. You can look for a specific version of a package by using the @ symbol, like this: gcc@10.
API method:
GET /api/packages?search=hello&page=1&limit=20
where search is your query, page is the page number, and limit is the number of items on a single page. Pagination information (such as the total number of pages) is returned in response headers.
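As a minimal sketch, the endpoint can be queried from Guile Scheme using the bundled (web client) module (HTTPS needs the guile-gnutls bindings); the host below is a placeholder since the base URL is not stated here, and the pagination header names are not documented above, so inspect the full header list:

(use-modules (web client)    ; http-get
             (web uri)       ; string->uri
             (web response)) ; response-headers

;; Placeholder host: substitute this site's actual base URL.
(define-values (resp body)
  (http-get (string->uri
             "https://example.org/api/packages?search=hello&page=1&limit=20")))

;; Pagination details are carried in the response headers; since the header
;; names are not listed above, print them all and pick the relevant ones.
(display (response-headers resp))
(newline)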
If you'd like to join our channel webring, send a patch to ~whereiseveryone/toys@lists.sr.ht adding your channel as an entry in channels.scm.
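For reference, a webring entry would typically be a standard Guix channel record like the hedged sketch below; the name and URL are placeholders, and the exact shape expected by channels.scm may differ, so mirror the entries already present in that file:

;; Hypothetical entry: adjust the name, url, and branch to point at your channel,
;; and follow the format of the existing entries in channels.scm.
(channel
 (name 'my-channel)
 (url "https://git.example.org/my-channel.git")
 (branch "main"))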
Cloth Simulation Filter (CSF) is an airborne LiDAR (Light Detection and Ranging) ground-point filtering algorithm based on cloth simulation. By simulating the interactions between the cloth nodes and the corresponding LiDAR points, the locations of the cloth nodes can be determined to generate an approximation of the ground surface <https://www.mdpi.com/2072-4292/8/6/501/htm>.
This package provides functions to retrieve data and metadata from providers that disseminate data by means of SDMX web services. SDMX (Statistical Data and Metadata eXchange) is a standard that has been developed with the aim of simplifying the exchange of statistical information. More about the SDMX standard and the SDMX Web Services can be found at: <https://sdmx.org>.
This package provides the real-time quantitative polymerase chain reaction (qPCR) technical data sets of Ruijter et al. (2013) <doi:10.1016/j.ymeth.2012.08.011>: (i) the four-point 10-fold dilution series; (ii) 380 replicates; and (iii) the competimer data set. These three data sets can be used to benchmark qPCR methods. The original data set is available at <https://medischebiologie.nl/wp-content/uploads/2019/02/qpcrdatamethods.zip>. This package fixes incorrect annotations in the original data sets.
This package provides an easy way to report the results of regression analysis, including: 1. Proportional hazards regression from function coxph of package 'survival'; 2. Conditional logistic regression from function clogit of package 'survival'; 3. Ordered logistic regression from function polr of package 'MASS'; 4. Binary logistic regression from function glm of package 'stats'; 5. Linear regression from function lm of package 'stats'; 6. Risk regression model for survival analysis with competing risks from function FGR of package 'riskRegression'; 7. Multilevel model from function lme of package 'nlme'.
Multi-block data analysis concerns the analysis of several sets of variables (blocks) observed on the same group of individuals. The main aims of the RGCCA package are to study the relationships between blocks and to identify subsets of variables of each block which are active in their relationships with the other blocks. This package allows the user to (i) run R/SGCCA and related methods, (ii) find the optimal parameters for R/SGCCA, such as the regularization parameters (tau or sparsity), (iii) evaluate the stability of the RGCCA results and their significance, (iv) build predictive models from R/SGCCA, and (v) use generic print() and plot() functions that apply to all these functionalities.
Linguistic Descriptions of Complex Phenomena (LDCP) is an architecture and methodology for modeling complex phenomena, interpreting input data, and generating automatic text reports customized to the user's needs (see <doi:10.1016/j.ins.2016.11.002> and <doi:10.1007/s00500-016-2430-5>). This package contains a set of methods that facilitate the development of LDCP systems. Its main goal is to increase the visibility and practical use of this research line.
This package provides a simple WebDAV client with functions to fetch and send files or folders to servers using the WebDAV protocol (see RFC 4918 <https://www.rfc-editor.org/rfc/rfc4918>). Only a subset of the protocol is implemented (e.g. file locks are not yet supported).
Create densities, probabilities, random numbers, quantiles, and maximum likelihood estimates for several distributions, mainly the symmetric and asymmetric power exponential (AEP), a.k.a. the Subbotin family of distributions, also known as the generalized error distribution. Estimation follows the design of Bottazzi (2004) <https://ideas.repec.org/p/ssa/lemwps/2004-14.html>, where the likelihood is maximized by several optimization procedures using the GNU Scientific Library (GSL), translated to C++ code, which makes it both fast and accurate. The package also provides methods for the gamma, Laplace, and asymmetric Laplace distributions.
Three methods to calculate R2 for models with correlated errors, including Phylogenetic GLS, Phylogenetic Logistic Regression, Linear Mixed Models (LMMs), and Generalized Linear Mixed Models (GLMMs). See details in Ives 2018 <doi:10.1093/sysbio/syy060>.
This package provides a platform-independent basic-statistics GUI (graphical user interface) for R, based on the tcltk package.
Non-linear transformations of data to better discover latent effects. Applies a sequence of three transformations: (1) a Gaussianizing transformation, (2) a Z-score transformation, and (3) an outlier-removal transformation. A publication describing the method has the following citation: Gregory J. Hunt, Mark A. Dane, James E. Korkola, Laura M. Heiser & Johann A. Gagnon-Bartsch (2020) "Automatic Transformation and Integration to Improve Visualization and Discovery of Latent Effects in Imaging Data", Journal of Computational and Graphical Statistics, <doi:10.1080/10618600.2020.1741379>.
Algorithms to price American and European equity options, convertible bonds, and a variety of other financial derivatives. It uses an extension of the usual Black-Scholes model in which a jump to default may occur with a probability specified by a power-law link between stock price and hazard rate, as found in the paper by Takahashi, Kobayashi, and Nakagawa (2001) <doi:10.3905/jfi.2001.319302>. We use ideas and techniques from Andersen and Buffum (2002) <doi:10.2139/ssrn.355308> and Linetsky (2006) <doi:10.1111/j.1467-9965.2006.00271.x>.
The metrics() function calculates measures of scholarly impact. These include conventional measures, such as the number of publications and the total citations to all publications, as well as modern and robust metrics based on the vector of citations associated with each publication, such as the h index and many of its variants or rivals. These methods are described in Ruscio et al. (2012) <doi:10.1080/15366367.2012.711147>.
Model-based simulation of dynamic networks under tie-oriented (Butts, C., 2008, <doi:10.1111/j.1467-9531.2008.00203.x>) and actor-oriented (Stadtfeld, C., & Block, P., 2017, <doi:10.15195/v4.a14>) relational event models. Supports simulation from a variety of relational event model extensions, including temporal variability in effects, heterogeneity through dyadic latent class relational event models (DLC-REM), random effects, blockmodels, and memory decay in relational event models (Lakdawala, R., 2024, <doi:10.48550/arXiv.2403.19329>). The development of this package was supported by a Vidi Grant (452-17-006) awarded by the Netherlands Organization for Scientific Research (NWO) and an ERC Starting Grant (758791).
Data in multidimensional systems is obtained from operational systems and is transformed to adapt it to the new structure. Frequently, the operations to be performed aim to transform a flat table into a ROLAP (Relational On-Line Analytical Processing) star database. The main objective of the package is to allow these transformations to be defined easily. The implementation of the multidimensional database obtained can be exported to work with multidimensional analysis tools on spreadsheets or relational databases.
Allows the user to view an image in full screen when clicking on it in RMarkdown documents and shiny applications. The package relies on the JavaScript library 'intense-images'. See <https://tholman.com/intense-images/> for more information.
Computing the singular value decomposition with robustness is a challenging task. This package provides an implementation of robust SVD using density power divergence (<doi:10.48550/arXiv.2109.10680>), which combines the ideas of robustness and efficiency in estimation, balanced by a tuning parameter. It also provides utility functions to simulate various scenarios to compare the performance of different algorithms.
The Brazilian Central Bank API delivers many datasets covering economic activity, the regional economy, the international economy, public finances, credit indicators, and more. For more information please see <http://dadosabertos.bcb.gov.br/>. These datasets can be accessed through rbcb functions and can be obtained in different data structures common to R ('tibble', 'data.frame', 'xts', ...).
Access data stored in REDCap databases using the Application Programming Interface (API). REDCap (Research Electronic Data CAPture; <https://projectredcap.org>, Harris et al. (2009) <doi:10.1016/j.jbi.2008.08.010>, Harris et al. (2019) <doi:10.1016/j.jbi.2019.103208>) is a web application for building and managing online surveys and databases developed at Vanderbilt University. The API allows users to access data and project metadata (such as the data dictionary) from the web programmatically. The redcapAPI package facilitates the process of accessing data, with options to prepare an analysis-ready data set consistent with the definitions in a database's data dictionary.
An R API Client for Valve's Dota2. RDota2 can be easily used to connect to the Steam API and retrieve data for Valve's popular video game Dota2. You can find out more about Dota2 at <http://store.steampowered.com/app/570/>.
Estimates the statistically significant (within the 95% CI) rolling window wavelet correlation (RWWC) coefficients between two regular (evenly spaced) time series and plots them as a heat map. RolWinWavCor also plots the time series under study in the same graphic. RolWinWavCor was designed for financial time series, but it can be used with other kinds of data (e.g., climatic, ecological, geological, etc.). The functions contained in RolWinWavCor are highly flexible, since they contain parameters to customize the time series under analysis and the heat maps of the rolling window wavelet correlation coefficients. Moreover, we have also included a data set (named EU_stock_markets) that contains nine European stock market indices to exemplify the use of the functions contained in RolWinWavCor. Methods derived from Polanco-Martínez et al. (2018) <doi:10.1016/j.physa.2017.08.065>.
This package performs robust estimation and inference when using covariate adjustment and/or covariate-adaptive randomization in randomized controlled trials. The package is trimmed to reduce dependencies and validated for use across the industry. See "FDA's final guidance on covariate adjustment" <https://www.regulations.gov/docket/FDA-2019-D-0934>, Tsiatis (2008) <doi:10.1002/sim.3113>, Bugni et al. (2018) <doi:10.1080/01621459.2017.1375934>, Ye, Shao, Yi, and Zhao (2023) <doi:10.1080/01621459.2022.2049278>, Ye, Shao, and Yi (2022) <doi:10.1093/biomet/asab015>, Rosenblum and van der Laan (2010) <doi:10.2202/1557-4679.1138>, Wang et al. (2021) <doi:10.1080/01621459.2021.1981338>, Ye, Bannick, Yi, and Shao (2023) <doi:10.1080/24754269.2023.2205802>, and Bannick, Shao, Liu, Du, Yi, and Ye (2024) <doi:10.48550/arXiv.2306.10213>.
This package provides subsets with reference semantics, i.e. subsets which automatically reflect changes in the original object, and which optionally update the original object when they are changed.
This package provides a collection of tools for measuring the similarity of text messages and tracing the flow of messages over time and across media.