This is an R package for spell checking common document formats, including LaTeX, markdown, manual pages, and DESCRIPTION files. It includes utilities to automate checking of documentation and vignettes as a unit test during R CMD check. Both British and American English are supported out of the box, and other languages can be added. In addition, packages may define a wordlist to allow custom terminology without having to abuse punctuation.
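A minimal sketch of typical usage, assuming the package's documented entry points spell_check_package(), update_wordlist() and spell_check_setup() (check the reference manual for exact arguments):

    # Spell check all documentation, vignettes and DESCRIPTION of a source package
    library(spelling)
    issues <- spell_check_package("path/to/pkg")
    print(issues)

    # Record the currently flagged words in inst/WORDLIST so custom terminology
    # is accepted in future checks
    update_wordlist("path/to/pkg")

    # Add a unit test that re-runs the spell check during R CMD check
    spell_check_setup("path/to/pkg")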
Winit is a window creation and management library. It can create windows and lets you handle events (for example: the window being resized, a key being pressed, a mouse movement, etc.) produced by a window.
Winit is designed to be a low-level brick in a hierarchy of libraries. Consequently, in order to show something on the window you need to use the platform-specific getters provided by winit, or another library.
This Rust crate implements a file system walk that runs in parallel using rayon. It attempts to combine the parallelism of ignore with walkdir's streaming iterator API. Entries are streamed in sorted order, with options for custom sorting, filtering, and skipping.
Directory traversal is already pretty fast. If you don't need this crate's speed, then walkdir provides a smaller and more tested single-threaded implementation.
This package provides a lightweight but powerful R interface to the Azure Resource Manager REST API. The package exposes a comprehensive class framework and related tools for creating, updating and deleting Azure resource groups, resources and templates. While AzureRMR can be used to manage any Azure service, it can also be extended by other packages to provide extra functionality for specific services. Part of the AzureR family of packages.
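A short sketch of a typical session with AzureRMR's R6 interface; the method names follow the package's documentation, but treat the call sequence as illustrative:

    library(AzureRMR)

    # Authenticate with Azure Resource Manager; the first call is interactive,
    # later calls reuse the cached token
    az <- get_azure_login()

    # Drill down to a subscription, then create a resource group in it
    sub <- az$get_subscription("your-subscription-id")
    rg  <- sub$create_resource_group("myresourcegroup", location = "westus")

    # ... create/update resources or deploy templates via rg's methods ...

    # Delete the group (and everything in it) when done
    rg$delete(confirm = FALSE)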
Implementation of the bunching estimator for kinks and notches. Allows for flexible estimation of the counterfactual (e.g. controlling for round-number bunching, accounting for other bunching masses within the bunching window, fixing the bunching point to be the minimum, maximum or median value in its bin, etc.). It produces publication-ready plots in the style followed since Chetty et al. (2011) <doi:10.1093/qje/qjr013>, with extensive options for customizing the plots.
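As a hedged illustration, a call might look like the sketch below; bunchit() and its argument names are assumptions about the package's interface, so consult the documentation for the actual signature:

    library(bunching)

    # Toy running variable (e.g. earnings) with a kink point at z* = 10000
    set.seed(1)
    earnings <- round(rnorm(10000, mean = 10000, sd = 2000))

    # Function and argument names below are assumed, not a verbatim API
    res <- bunchit(
      z_vector = earnings,          # the running variable
      zstar    = 10000,             # location of the kink/notch
      binwidth = 50,                # bin width for the counts
      bins_l   = 20, bins_r = 20,   # bins included left/right of z*
      t0 = 0, t1 = 0.2              # tax rates below/above the kink (kink case)
    )
    res$b      # estimated normalized excess mass
    res$plot   # publication-style bunching plot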
Fit Bayesian models using brms/Stan with parsnip/tidymodels via the bayesian package <doi:10.5281/zenodo.4426836>. tidymodels is a collection of packages for machine learning; see Kuhn and Wickham (2020) <https://www.tidymodels.org>. The technical details of brms and Stan are described in Bürkner (2017) <doi:10.18637/jss.v080.i01>, Bürkner (2018) <doi:10.32614/RJ-2018-017>, and Carpenter et al. (2017) <doi:10.18637/jss.v076.i01>.
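For orientation, a tidymodels-style fit might look like this sketch, assuming bayesian() returns a parsnip model specification that accepts a "brms" engine (as the description implies):

    library(bayesian)
    library(parsnip)

    # Model specification backed by brms/Stan
    spec <- bayesian(mode = "regression") |>
      set_engine("brms")

    # Fit with the usual parsnip interface; this compiles and samples a Stan
    # model, so it takes noticeably longer than a classical glm fit
    fitted <- fit(spec, mpg ~ wt + hp, data = mtcars)

    # The underlying brmsfit object is stored in the parsnip fit
    summary(fitted$fit)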
This package provides functions to align curves and to compute mean curves based on the elastic distance defined in the square-root-velocity framework. For more details on this framework see Srivastava and Klassen (2016, <doi:10.1007/978-1-4939-4020-2>). For more theoretical details on our methods and algorithms see Steyer et al. (2023, <doi:10.1111/biom.13706>) and Steyer et al. (2023, <arXiv:2305.02075>).
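A small sketch of how alignment and mean estimation might be called; align_curves() and compute_elastic_mean() are assumptions about the exported function names:

    library(elasdics)

    # Two toy planar curves, each stored as a data frame of coordinates
    t1 <- seq(0, 1, length.out = 20)
    t2 <- seq(0, 1, length.out = 15)
    curve1 <- data.frame(x1 = cos(pi * t1), x2 = sin(pi * t1))
    curve2 <- data.frame(x1 = cos(pi * t2)^2, x2 = sin(pi * t2))

    # Align the second curve to the first with respect to the elastic distance
    aligned <- align_curves(curve1, curve2)

    # Elastic mean of a list of (possibly sparsely observed) curves
    mean_curve <- compute_elastic_mean(list(curve1, curve2))
    plot(mean_curve)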
Estimates and provides inference for quantities that assess high-dimensional mediation and potential surrogate markers, including the direct effect of treatment, the indirect effect of treatment, and the proportion of treatment effect explained by a surrogate/mediator; details are described in Zhou et al. (2022) <doi:10.1002/sim.9352> and Zhou et al. (2020) <doi:10.1093/biomet/asaa016>. This package relies on the optimization software MOSEK, <https://www.mosek.com>.
The genridge package introduces generalizations of the standard univariate ridge trace plot used in ridge regression and related methods. These graphical methods show both bias (actually, shrinkage) and precision, by plotting the covariance ellipsoids of the estimated coefficients, rather than just the estimates themselves. 2D and 3D plotting methods are provided, both in the space of the predictor variables and in the transformed space of the PCA/SVD of the predictors.
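For example, a ridge trace for the classic longley data could be produced roughly as follows; the ridge() formula interface and the traceplot()/plot() methods are taken from the package documentation, but the exact arguments should be checked there:

    library(genridge)

    # Ridge regression over a grid of shrinkage constants
    lambda <- c(0, 0.005, 0.01, 0.02, 0.04, 0.08)
    lridge <- ridge(Employed ~ GNP + Unemployed + Armed.Forces + Population + Year,
                    data = longley, lambda = lambda)

    # Classical univariate ridge trace of the coefficients
    traceplot(lridge)

    # Bivariate view: covariance ellipsoids for a pair of coefficients,
    # showing shrinkage (bias) and precision together
    plot(lridge, variables = c("GNP", "Unemployed"))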
This algorithm is described in detail in the paper "Hedging Forecast Combinations With an Application to the Random Forest" by Beck et al. (2024) <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5032102>. The package provides a function hedgedrf() that can be used to train a Hedged Random Forest model on a dataset, and a function predict.hedgedrf() that can be used to make predictions with the model.
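Given the two exported functions named above, usage presumably follows the standard R modeling pattern; the formula/data arguments in this sketch are an assumption about the interface:

    library(hedgedrf)

    # Toy train/test split
    set.seed(42)
    idx   <- sample(nrow(mtcars), 22)
    train <- mtcars[idx, ]
    test  <- mtcars[-idx, ]

    # Train a Hedged Random Forest (argument names are assumed, not verified)
    model <- hedgedrf(mpg ~ ., data = train)

    # predict.hedgedrf() is dispatched through the usual predict() generic
    preds <- predict(model, test)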
The app calculates the ICER (incremental cost-effectiveness ratio; Rawlins 2012, <doi:10.1016/B978-0-7020-4084-9.00044-6>) from the mean costs and quality-adjusted life years (QALYs; Torrance and Feeny 2009, <doi:10.1017/S0266462300008461>) for a set of treatment options, and draws the efficiency frontier in the cost-effectiveness plane. The app automatically identifies and excludes dominated and extended-dominated options from the ICER calculation.
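The underlying arithmetic is simple: for two options, the ICER is the difference in mean costs divided by the difference in mean QALYs. A tiny illustration (this is just the formula, not the app's code):

    # Mean cost and mean QALYs for a comparator (0) and a new treatment (1)
    cost0 <- 12000; qaly0 <- 5.2
    cost1 <- 18000; qaly1 <- 5.9

    # Incremental cost-effectiveness ratio: extra cost per additional QALY gained
    icer <- (cost1 - cost0) / (qaly1 - qaly0)
    icer   # about 8571 currency units per QALY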
The wiDB...() functions provide an interface to the public API of the wiDB <https://github.com/SPATIAL-Lab/isoWater/blob/master/Protocol.md>: build, check and submit queries, and receive and unpack responses. Data analysis functions support Bayesian inference of the source and source isotope composition of water samples that may have experienced evaporation. Algorithms adapted from Bowen et al. (2018, <doi:10.1007/s00442-018-4192-5>).
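A hedged sketch of a query workflow; the specific wiDB_*() function names and arguments below are assumptions (the linked protocol document describes the actual query fields):

    library(isoWater)

    # Build and submit queries to the wiDB (names/arguments are illustrative)
    sites <- wiDB_sites(countries = "US", types = "River_or_stream")
    dat   <- wiDB_data(minLat = 40, maxLat = 42, minLong = -112, maxLong = -110,
                       types = "Precipitation")

    # Responses typically unpack into site metadata plus measured d2H/d18O values
    head(dat$data)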
Variational expectation-maximization algorithm to fit the noisy stochastic block model to an observed dense graph and to perform node clustering. The package also provides a graph inference procedure to recover the underlying binary graph, with control of the false discovery rate. The method is described in the article "Powerful graph inference with false discovery rate control" by T. Rebafka, E. Roquain and F. Villers (2020) <arXiv:1907.10176>.
This package provides functions to compute split generalized linear models. The approach fits generalized linear models that split the covariates into groups. The optimal split of the variables into groups and the regularized estimation of the coefficients are performed by minimizing an objective function that encourages sparsity within each group and diversity among them. Example applications can be found in Christidis et al. (2021) <doi:10.48550/arXiv.2102.08591>.
This package provides tools for selecting variables for the joint modeling of mean and dispersion (including models for mixture experiments), based on hypothesis testing and the quality of the model's fit. In each iteration of the selection process, a goodness-of-fit criterion is used as a filter for choosing the terms that will be evaluated by a hypothesis test. The method is described in Pinto & Pereira (2021) <arXiv:2109.07978>.
Computes test statistics for examining the significance of autocorrelation in univariate time series, cross-correlation in bivariate time series, and Pearson correlations in multivariate series, as well as test statistics for the i.i.d. property of univariate series, as given in Dalla, Giraitis and Phillips (2022), <https://www.cambridge.org/core/journals/econometric-theory/article/abs/robust-tests-for-white-noise-and-crosscorrelation/4D77C12C52433F4C6735E584C779403A>, <https://elischolar.library.yale.edu/cowles-discussion-paper-series/57/>.
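A sketch of typical calls; ac.test(), cc.test() and iid.test() are the function names suggested by the package's scope, and should be treated as assumptions until checked against the manual:

    library(testcorr)

    set.seed(1)
    x <- rnorm(200)                # univariate series
    y <- 0.5 * x + rnorm(200)      # second series for cross-correlation

    ac.test(x, max.lag = 10)       # robust tests for autocorrelation of x
    cc.test(x, y, max.lag = 10)    # robust tests for cross-correlation of x and y
    iid.test(x, max.lag = 10)      # tests of the i.i.d. property of x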
Complete workflow for the analysis of pharmacokinetic-pharmacodynamic (PKPD), physiologically based pharmacokinetic (PBPK) and systems pharmacology models, including: creation of ordinary differential equation-based models, pooled parameter estimation, individual/population-based simulations, rule-based simulations for clinical trial design and modeling assays, deployment with a customizable Shiny app, and non-compartmental analysis. System-specific analysis templates can be generated, and each element includes integrated reporting with PowerPoint and Word.
Perform analysis of the World Health Organization (WHO) pharmacovigilance database VigiBase (Extract Case Level version) <https://who-umc.org/>: e.g., load data, perform data management, disproportionality analysis, and descriptive statistics. Intended for routine pharmacovigilance use or studies. This package is not supported by, and does not reflect the opinion of, the WHO or the Uppsala Monitoring Centre. Disproportionality methods are described by Norén et al. (2013) <doi:10.1177/0962280211403604>.