Enter the query into the form above. You can look for a specific version of a package by using the @ symbol, like this: gcc@10.
API method:
GET /api/packages?search=hello&page=1&limit=20
where search is your query, page is the page number, and limit is the number of items on a single page. Pagination information (such as the number of pages) is returned
in the response headers.
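For example, a minimal sketch of calling this endpoint from R, assuming the body is JSON; the host name below is a placeholder, and only the path and the search/page/limit parameters come from the description above:

    library(httr)

    resp <- GET("https://example.org/api/packages",
                query = list(search = "hello", page = 1, limit = 20))
    pkgs <- content(resp, as = "parsed")  # decoded response body (JSON assumed)
    headers(resp)                         # pagination info, e.g. number of pages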
If you'd like to join our channel webring, send a patch to ~whereiseveryone/toys@lists.sr.ht adding your channel as an entry in channels.scm.
This package contains tools for working with and analyzing hospital readmissions data. The package provides utilities for components of the Hospital Readmissions Reduction Program (HRRP), including program timeline functions, Hospital-Specific Report (HSR) helpers, and general importing tools for the Provider Data Catalog (PDC).
It helps you read (.dim) images, together with their coordinate reference system (CRS), directly into R. Both Sentinel-1 and Sentinel-2 images, as well as any other data processed with this software, can be imported.
This package provides a collection of supporting functions for gathering and plotting treatment ranking metrics after network meta-analysis.
Build regular expressions piece by piece using human-readable code. This package contains date and time functionality and is primarily intended for use by package developers.
Recursive lists in the form of R objects, JSON, and XML, for use in teaching and examples. Examples include color palettes, Game of Thrones characters, GitHub users and repositories, music collections, and entities from the Star Wars universe. Data from the gapminder package is also included, as a simple data frame and in nested and split forms.
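This description matches the repurrrsive package on CRAN; assuming that is the package described here, a minimal sketch of exploring one of its bundled lists:

    library(repurrrsive)  # assumed package name
    library(purrr)

    length(got_chars)                 # one element per Game of Thrones character
    map_chr(got_chars, "name")[1:3]   # pull the "name" field from each sub-list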
Unified object-oriented interface for multiple independent streams of random numbers from different sources.
This package provides several non-parametric randomness tests for numeric sequences.
Reads in sample description and slide description files and annotates the expression values taken from GenePix results files (a text file format used by many microarray scanner and software providers). After normalization, the data can be visualized as a boxplot, heatmap, or dotplot.
Create tests and tasks compliant with the Question & Test Interoperability (QTI) information model, version 2.1. Input sources are Rmd/md description files or S4-class objects. Output formats include standalone ZIP or XML files. Supports the generation of basic task types (single and multiple choice, ordering, pair association, matching tables, gap filling, and essay) and provides a comprehensive set of attributes for customizing tests.
Empirical orthogonal teleconnections in R. remote is short for 'R(-based) EMpirical Orthogonal TEleconnections'. It implements a collection of functions to facilitate empirical orthogonal teleconnection analysis. Empirical Orthogonal Teleconnections (EOTs) denote a regression-based approach to decomposing spatio-temporal fields into a set of independent orthogonal patterns. They are quite similar to Empirical Orthogonal Functions (EOFs), with EOTs producing less abstract results. In contrast to EOFs, which are orthogonal in both space and time, EOT analysis produces patterns that are orthogonal in either space or time.
Numerous functions for cohort-based analyses, either for prediction or causal inference. For causal inference, it includes Inverse Probability Weighting and G-computation for marginal estimation of an exposure effect when confounders are expected. It deals with binary outcomes, times-to-events, competing events, and multi-state data. For multi-state data, a semi-Markov model with interval censoring may be considered, and the excess mortality related to the disease, relative to reference lifetime tables, can optionally be taken into account. For predictive studies, it offers a set of functions to estimate time-dependent receiver operating characteristic (ROC) curves, with possible consideration of right-censored times-to-events or the presence of confounders. Finally, several functions are available to assess time-dependent ROC curves or survival curves from aggregated data.
Generates disease-specific drug-response profiles that are independent of time, concentration, and cell line. Based on the cell lines used as surrogates, the returned profiles represent the unique transcriptional changes induced by a compound in a given disease.
Converting ASCII text into (floating-point) numeric values is a very common problem. The fast_float header-only C++ library by Daniel Lemire does it very well and very fast, at up to or over 1 gigabyte per second, as described in more detail in <doi:10.1002/spe.2984>. fast_float is licensed under the Apache 2.0 license and is provided here for use by other R packages via a simple LinkingTo: statement.
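This description matches the RcppFastFloat package on CRAN; assuming that is the package described here, a hedged sketch of the LinkingTo: workflow, emulated inline via Rcpp (a downstream package would instead put LinkingTo: RcppFastFloat in its DESCRIPTION file):

    library(Rcpp)

    # depends = "RcppFastFloat" plays the role of the LinkingTo: statement,
    # making the fast_float headers visible to the compiler.
    cppFunction(depends  = "RcppFastFloat",
                includes = "#include <fast_float/fast_float.h>",
                code = '
      double parse_num(std::string s) {
        double out = 0.0;
        auto res = fast_float::from_chars(s.data(), s.data() + s.size(), out);
        if (res.ec != std::errc()) Rcpp::stop("not a valid number");
        return out;
      }')

    parse_num("3.14")  # 3.14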
PaleoClim <http://www.paleoclim.org> (Brown et al. 2019, <doi:10.1038/sdata.2018.254>) is a set of free, high-resolution paleoclimate surfaces covering the whole globe. It includes data on surface temperature, precipitation, and the standard bioclimatic variables commonly used in ecological modelling, derived from the HadCM3 general circulation model and downscaled to a spatial resolution of up to 2.5 minutes. Simulations are available for key time periods from the Late Holocene to the mid-Pliocene. Data on current and Last Glacial Maximum climate is derived from CHELSA (Karger et al. 2017, <doi:10.1038/sdata.2017.122>) and reprocessed by PaleoClim to match their format; it is available at up to 30-second resolution. This package provides a simple interface for downloading PaleoClim data in R, with support for caching and filtering retrieved data by period, resolution, and geographic extent.
Integrated tools to support rigorous and well documented data harmonization based on Maelstrom Research guidelines. The package includes functions to assess and prepare input elements, apply specified processing rules to generate harmonized datasets, validate data processing and identify processing errors, and document and summarize harmonized outputs. The harmonization process is defined and structured by two key user-generated documents: the DataSchema (specifying the list of harmonized variables to generate across datasets) and the Data Processing Elements (specifying the input elements and processing algorithms to generate harmonized variables in DataSchema formats). The package was developed to address key challenges of retrospective data harmonization in epidemiology (as described in Fortier et al. (2017) <doi:10.1093/ije/dyw075>) but can be used for any data harmonization initiative.
Building interactive web applications with R is incredibly easy with shiny. Behind the scenes, shiny builds a reactive graph that can quickly become intertwined and difficult to debug. reactlog (Schloerke 2019) <doi:10.5281/zenodo.2591517> provides visual insight into that black box of shiny reactivity by constructing a directed dependency graph of the application's reactive state at any time point in a reactive recording.
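A minimal sketch of the documented reactlog workflow (app here stands for a hypothetical shiny application object, used only for illustration):

    library(shiny)
    library(reactlog)

    reactlog_enable()   # same effect as options(shiny.reactlog = TRUE)
    # runApp(app)       # interact with the app, then stop it
    reactlogShow()      # open the recorded dependency graph in the browser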
Data with irregular spatial support, such as runoff-related data or data from administrative units, can be interpolated with rtop to locations without observations using the top-kriging method. A description of the package is given by Skøien et al. (2014) <doi:10.1016/j.cageo.2014.02.009>.
A set of functions for Regression Discontinuity Design (RDD), covering data visualisation, estimation, and testing.
Reference-based multiple imputation of ordinal and binary responses under a Bayesian framework, as described in Wang and Liu (2022) <arXiv:2203.02771>. Methods for missing-not-at-random data include Jump-to-Reference (J2R), Copy Reference (CR), and Delta Adjustment, which can generate tipping point analyses.
Play the classic game of tic-tac-toe (noughts and crosses).
An implementation of robust bent line regression. It can fit a bent line regression and test for the existence of a change point, accompanying the paper: Feipeng Zhang and Qunhua Li (2016), "Robust bent line regression", submitted.
This package implements a novel numerical algorithm for estimating the exact 95% confidence interval of the location parameter in the random effects model, and it is much faster than the naive method. It works best when the number of studies is between 6 and 20.
Compute the repeated measures correlation, a statistical technique for determining the overall within-individual relationship among paired measures assessed on two or more occasions, first introduced by Bland and Altman (1995). Includes functions for diagnostics, p-values, and effect sizes with confidence intervals (including optional bootstrapping), as well as graphing. Also includes several example datasets. For more details, see the web documentation <https://lmarusich.github.io/rmcorr/index.html> and the original paper: Bakdash and Marusich (2017) <doi:10.3389/fpsyg.2017.00456>.
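A minimal sketch based on the example in the rmcorr documentation linked above, using the bundled bland1995 dataset:

    library(rmcorr)

    fit <- rmcorr(participant = Subject, measure1 = PaCO2, measure2 = pH,
                  dataset = bland1995)
    fit$r      # repeated measures correlation coefficient
    fit$CI     # its confidence interval
    plot(fit)  # observations with per-participant fits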
Utilities for sparse signal recovery, suitable for compressed sensing: L1, L2, and TV penalties; a DFT basis matrix; a simple sparse signal generator; the mutual cumulative coherence between two matrices, with examples; the Lp complex norm; and scaling back of regression coefficients.