Writing R Extensions


This is a guide to extending R, describing the process of creating R add-on packages, writing R documentation, R’s system and foreign language interfaces, and the R API.

This manual is for R, version 3.5.0 Under development (2017-12-12).

Copyright © 1999–2017 R Core Team

Permission is granted to make and distribute verbatim copies of this manual provided the copyright notice and this permission notice are preserved on all copies.

Permission is granted to copy and distribute modified versions of this manual under the conditions for verbatim copying, provided that the entire resulting derived work is distributed under the terms of a permission notice identical to this one.

Permission is granted to copy and distribute translations of this manual into another language, under the above conditions for modified versions, except that this permission notice may be stated in a translation approved by the R Core Team.



Acknowledgements

The contributions to early versions of this manual by Saikat DebRoy (who wrote the first draft of a guide to using .Call and .External) and Adrian Trapletti (who provided information on the C++ interface) are gratefully acknowledged.



1 Creating R packages

Packages provide a mechanism for loading optional code, data and documentation as needed. The R distribution itself includes about 30 packages.

In the following, we assume that you know the library() command, including its lib.loc argument, and we also assume basic knowledge of the R CMD INSTALL utility. Otherwise, please look at R’s help pages on

?library
?INSTALL

before reading on.

For packages which contain code to be compiled, a computing environment including a number of tools is assumed; the “R Installation and Administration” manual describes what is needed for each OS.

Once a source package is created, it must be installed by the command R CMD INSTALL. See Add-on-packages in R Installation and Administration.

Other types of extensions are supported (but rare): See Package types.

Some notes on terminology complete this introduction. These will help with the reading of this manual, and also in describing concepts accurately when asking for help.

A package is a directory of files which extend R, a source package (the master files of a package), or a tarball containing the files of a source package, or an installed package, the result of running R CMD INSTALL on a source package. On some platforms (notably macOS and Windows) there are also binary packages, a zip file or tarball containing the files of an installed package which can be unpacked rather than installing from sources.

A package is not a library. The latter is used in two senses in R documentation.

There are a number of well-defined operations on source packages.

The concept of lazy loading of code or data is mentioned at several points. This is part of the installation, always selected for R code but optional for data. When used the R objects of the package are created at installation time and stored in a database in the R directory of the installed package, being loaded into the session at first use. This makes the R session start up faster and use less (virtual) memory. (For technical details, see Lazy loading in R Internals.)

CRAN is a network of WWW sites holding the R distributions and contributed code, especially R packages. Users of R are encouraged to join in the collaborative project and to submit their own packages to CRAN: current instructions are linked from https://CRAN.R-project.org/banner.shtml#submitting.



1.1 Package structure

The sources of an R package consist of a subdirectory containing the files DESCRIPTION and NAMESPACE, and the subdirectories R, data, demo, exec, inst, man, po, src, tests, tools and vignettes (some of which can be missing, but which should not be empty). The package subdirectory may also contain files INDEX, configure, cleanup, LICENSE, LICENCE and NEWS. Other files such as INSTALL (for non-standard installation instructions), README/README.md, or ChangeLog will be ignored by R, but may be useful to end users. The utility R CMD build may add files in a build directory (but this should not be used for other purposes).

Except where specifically mentioned, packages should not contain Unix-style ‘hidden’ files/directories (that is, those whose name starts with a dot).

The DESCRIPTION and INDEX files are described in the subsections below. The NAMESPACE file is described in the section on Package namespaces.

The optional files configure and cleanup are (Bourne) shell scripts which are, respectively, executed before and (if option --clean was given) after installation on Unix-alikes, see Configure and cleanup. The analogues on Windows are configure.win and cleanup.win.

For the conventions for files NEWS and ChangeLog in the GNU project see https://www.gnu.org/prep/standards/standards.html#Documentation.

The package subdirectory should be given the same name as the package. Because some file systems (e.g., those on Windows and by default on OS X) are not case-sensitive, to maintain portability it is strongly recommended that case distinctions not be used to distinguish different packages. For example, if you have a package named foo, do not also create a package named Foo.

To ensure that file names are valid across file systems and supported operating systems, the ASCII control characters as well as the characters ‘"’, ‘*’, ‘:’, ‘/’, ‘<’, ‘>’, ‘?’, ‘\’, and ‘|’ are not allowed in file names. In addition, files with names ‘con’, ‘prn’, ‘aux’, ‘clock$’, ‘nul’, ‘com1’ to ‘com9’, and ‘lpt1’ to ‘lpt9’ after conversion to lower case and stripping possible “extensions” (e.g., ‘lpt5.foo.bar’), are disallowed. Also, file names in the same directory must not differ only by case (see the previous paragraph). In addition, the basenames of ‘.Rd’ files may be used in URLs and so must be ASCII and not contain %. For maximal portability filenames should contain only ASCII characters not excluded already (that is A-Za-z0-9._!#$%&+,;=@^(){}'[] — we exclude space as many utilities do not accept spaces in file paths): non-English alphabetic characters cannot be guaranteed to be supported in all locales. It would be good practice to avoid the shell metacharacters (){}'[]$~: ~ is also used as part of ‘8.3’ filenames on Windows. In addition, packages are normally distributed as tarballs, and these have a limit on path lengths: for maximal portability 100 bytes.

If possible, a source package should not contain binary executable files: they are not portable, and a security risk if they are of the appropriate architecture. R CMD check will warn about them unless they are listed (one filepath per line) in a file BinaryFiles at the top level of the package. Note that CRAN will not accept submissions containing binary files even if they are listed.

The R function package.skeleton can help to create the structure for a new package: see its help page for details.
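
For illustration only (the package name ‘pkgname’ and the function f below are placeholders, not part of any real package), a skeleton could be created from objects in the current workspace along these lines:

f <- function(x, y) x + y   # a placeholder function to package up
package.skeleton(name = "pkgname", list = "f", path = tempdir())

The resulting skeleton (written here under tempdir()) still needs its DESCRIPTION file and the generated help file stubs to be edited by hand.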



1.1.1 The DESCRIPTION file

The DESCRIPTION file contains basic information about the package in the following format:

Package: pkgname
Version: 0.5-1
Date: 2015-01-01
Title: My First Collection of Functions
Authors@R: c(person("Joe", "Developer", role = c("aut", "cre"),
                     email = "Joe.Developer@some.domain.net"),
              person("Pat", "Developer", role = "aut"),
              person("A.", "User", role = "ctb",
                     email = "A.User@whereever.net"))
Author: Joe Developer [aut, cre],
  Pat Developer [aut],
  A. User [ctb]
Maintainer: Joe Developer <Joe.Developer@some.domain.net>
Depends: R (>= 3.1.0), nlme
Suggests: MASS
Description: A (one paragraph) description of what
  the package does and why it may be useful.
License: GPL (>= 2)
URL: https://www.r-project.org, http://www.another.url
BugReports: https://pkgname.bugtracker.url

The format is that of a version of a ‘Debian Control File’ (see the help for ‘read.dcf’ and https://www.debian.org/doc/debian-policy/index.html#document-ch-controlfields: R does not require encoding in UTF-8 and does not support comments starting with ‘#’). Fields start with an ASCII name immediately followed by a colon: the value starts after the colon and a space. Continuation lines (for example, for descriptions longer than one line) start with a space or tab. Field names are case-sensitive: all those used by R are capitalized.

For maximal portability, the DESCRIPTION file should be written entirely in ASCII — if this is not possible it must contain an ‘Encoding’ field (see below).

Several optional fields take logical values: these can be specified as ‘yes’, ‘true’, ‘no’ or ‘false’: capitalized values are also accepted.

The ‘Package’, ‘Version’, ‘License’, ‘Description’, ‘Title’, ‘Author’, and ‘Maintainer’ fields are mandatory, all other fields are optional. Fields ‘Author’ and ‘Maintainer’ can be auto-generated from ‘Authors@R’, and may be omitted if the latter is provided: however if they are not ASCII we recommend that they are provided.

The mandatory ‘Package’ field gives the name of the package. This should contain only (ASCII) letters, numbers and dot, have at least two characters and start with a letter and not end in a dot. If it needs explaining, this should be done in the ‘Description’ field (and not the ‘Title’ field).

The mandatory ‘Version’ field gives the version of the package. This is a sequence of at least two (and usually three) non-negative integers separated by single ‘.’ or ‘-’ characters. The canonical form is as shown in the example, and a version such as ‘0.01’ or ‘0.01.0’ will be handled as if it were ‘0.1-0’. It is not a decimal number, so for example 0.9 < 0.75 since 9 < 75.
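For illustration (this is not itself part of the DESCRIPTION specification), the ordering used for version numbers can be checked interactively in R:

package_version("0.9") < package_version("0.75")   # TRUE: component 9 is less than component 75
utils::compareVersion("0.9", "0.75")                # -1: the first version is the earlier one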

The mandatory ‘License’ field is discussed in the next subsection.

The mandatory ‘Title’ field should give a short description of the package. Some package listings may truncate the title to 65 characters. It should use title case (that is, use capitals for the principal words: tools::toTitleCase can help you with this), not use any markup, not have any continuation lines, and not end in a period (unless part of …). Do not repeat the package name: it is often used prefixed by the name. Refer to other packages and external software in single quotes, and to book titles (and similar) in double quotes.

The mandatory ‘Description’ field should give a comprehensive description of what the package does. One can use several (complete) sentences, but only one paragraph. It should be intelligible to all the intended readership (e.g. for a CRAN package to all CRAN users). It is good practice not to start with the package name, ‘This package’ or similar. As with the ‘Title’ field, double quotes should be used for quotations (including titles of books and articles), and single quotes for non-English usage, including names of other packages and external software. This field should also be used for explaining the package name if necessary. URLs should be enclosed in angle brackets, e.g. ‘<https://www.r-project.org>’: see also Specifying URLs.

The mandatory ‘Author’ field describes who wrote the package. It is a plain text field intended for human readers, but not for automatic processing (such as extracting the email addresses of all listed contributors: for that use ‘Authors@R’). Note that all significant contributors must be included: if you wrote an R wrapper for the work of others included in the src directory, you are not the sole (and maybe not even the main) author.

The mandatory ‘Maintainer’ field should give a single name followed by a valid (RFC 2822) email address in angle brackets. It should not end in a period or comma. This field is what is reported by the maintainer function and used by bug.report. For a CRAN package it should be a person, not a mailing list and not a corporate entity: do ensure that it is valid and will remain valid for the lifetime of the package.

Note that the display name (the part before the address in angle brackets) should be enclosed in double quotes if it contains non-alphanumeric characters such as comma or period. (The current standard, RFC 5322, allows periods but RFC 2822 did not.)

Both ‘Author’ and ‘Maintainer’ fields can be omitted if a suitable ‘Authors@R’ field is given. This field can be used to provide a refined and machine-readable description of the package “authors” (in particular specifying their precise roles), via suitable R code. It should create an object of class "person", by either a call to person or a series of calls (one per “author”) concatenated by c(): see the example DESCRIPTION file above. The roles can include ‘"aut"’ (author) for full authors, ‘"cre"’ (creator) for the package maintainer, ‘"ctb"’ (contributor) for other contributors, and ‘"cph"’ (copyright holder), among others. See ?person for more information. Note that no role is assumed by default. Auto-generated package citation information takes advantage of this specification. The ‘Author’ and ‘Maintainer’ fields are auto-generated from it if needed when building or installing.
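As an illustrative sketch (reusing the names from the example DESCRIPTION file above), the value of ‘Authors@R’ can be evaluated and inspected in an R session:

p <- c(person("Joe", "Developer", role = c("aut", "cre"),
              email = "Joe.Developer@some.domain.net"),
       person("A.", "User", role = "ctb"))
format(p)   # a plain-text rendering similar to that used for the auto-generated fields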

An optional ‘Copyright’ field can be used where the copyright holder(s) are not the authors. If necessary, this can refer to an installed file: the convention is to use file inst/COPYRIGHTS.

The optional ‘Date’ field gives the release date of the current version of the package. It is strongly recommended to use the ‘yyyy-mm-dd’ format conforming to the ISO 8601 standard.

The ‘Depends’, ‘Imports’, ‘Suggests’, ‘Enhances’, ‘LinkingTo’ and ‘Additional_repositories’ fields are discussed in a later subsection.

Dependencies external to the R system should be listed in the ‘SystemRequirements’ field, possibly amplified in a separate README file.

The ‘URL’ field may give a list of URLs separated by commas or whitespace, for example the homepage of the author or a page where additional material describing the software can be found. These URLs are converted to active hyperlinks in CRAN package listings. See Specifying URLs.

The ‘BugReports’ field may contain a single URL to which bug reports about the package should be submitted. This URL will be used by bug.report instead of sending an email to the maintainer. A browser is opened for a ‘http://’ or ‘https://’ URL. As from R 3.4.0, bug.report will try to extract an email address (preferably from a ‘mailto:’ URL or enclosed in angle brackets).

Base and recommended packages (i.e., packages contained in the R source distribution or available from CRAN and recommended to be included in every binary distribution of R) have a ‘Priority’ field with value ‘base’ or ‘recommended’, respectively. These priorities must not be used by other packages.

A ‘Collate’ field can be used for controlling the collation order for the R code files in a package when these are processed for package installation. The default is to collate according to the ‘C’ locale. If present, the collate specification must list all R code files in the package (taking possible OS-specific subdirectories into account, see Package subdirectories) as a whitespace separated list of file paths relative to the R subdirectory. Paths containing white space or quotes need to be quoted. An OS-specific collation field (‘Collate.unix’ or ‘Collate.windows’) will be used in preference to ‘Collate’.

The ‘LazyData’ logical field controls whether the R datasets use lazy-loading. A ‘LazyLoad’ field was used in versions prior to 2.14.0, but now is ignored.

The ‘KeepSource’ logical field controls if the package code is sourced using keep.source = TRUE or FALSE: it might be needed exceptionally for a package designed to always be used with keep.source = TRUE.

The ‘ByteCompile’ logical field controls if the package code is to be byte-compiled on installation: the default is currently not to, so this may be useful for a package known to benefit particularly from byte-compilation (which can take quite a long time and increases the installed size of the package). It is used for the recommended packages, as they are byte-compiled when R is installed and for consistency should be byte-compiled when updated. This can be overridden by installing with flag --no-byte-compile.

The ‘ZipData’ logical field was used to control whether the automatic Windows build would zip up the data directory or not prior to R 2.13.0: it is now ignored.

The ‘Biarch’ logical field is used on Windows to select the INSTALL option --force-biarch for this package.

The ‘BuildVignettes’ logical field can be set to a false value to stop R CMD build from attempting to build the vignettes, as well as preventing R CMD check from testing this. This should only be used exceptionally, for example if the PDFs include large figures which are not part of the package sources (and hence only in packages which do not have an Open Source license).

The ‘VignetteBuilder’ field names (in a comma-separated list) packages that provide an engine for building vignettes. These may include the current package, or ones listed in ‘Depends’, ‘Suggests’ or ‘Imports’. The utils package is always implicitly appended. See Non-Sweave vignettes for details.

If the DESCRIPTION file is not entirely in ASCII it should contain an ‘Encoding’ field specifying an encoding. This is used as the encoding of the DESCRIPTION file itself and of the R and NAMESPACE files, and as the default encoding of .Rd files. The examples are assumed to be in this encoding when running R CMD check, and it is used for the encoding of the CITATION file. Only encoding names latin1, latin2 and UTF-8 are known to be portable. (Do not specify an encoding unless one is actually needed: doing so makes the package less portable. If a package has a specified encoding, you should run R CMD build etc in a locale using that encoding.)

The ‘NeedsCompilation’ field should be set to "yes" if the package contains code which needs to be compiled, otherwise "no" (when the package could be installed from source on any platform without additional tools). This is used by install.packages(type = "both") in R >= 2.15.2 on platforms where binary packages are the norm: it is normally set by R CMD build or the repository assuming compilation is required if and only if the package has a src directory.

The ‘OS_type’ field specifies the OS(es) for which the package is intended. If present, it should be one of unix or windows, and indicates that the package can only be installed on a platform with ‘.Platform$OS.type’ having that value.

The ‘Type’ field specifies the type of the package: see Package types.

One can add subject classifications for the content of the package using the fields ‘Classification/ACM’ or ‘Classification/ACM-2012’ (using the Computing Classification System of the Association for Computing Machinery, http://www.acm.org/about/class/; the former refers to the 1998 version), ‘Classification/JEL’ (the Journal of Economic Literature Classification System, https://www.aeaweb.org/econlit/jelCodes.php), or ‘Classification/MSC’ or ‘Classification/MSC-2010’ (the Mathematics Subject Classification of the American Mathematical Society, http://www.ams.org/msc/; the former refers to the 2000 version). The subject classifications should be comma-separated lists of the respective classification codes, e.g., ‘Classification/ACM: G.4, H.2.8, I.5.1’.

A ‘Language’ field can be used to indicate if the package documentation is not in English: this should be a comma-separated list of standard (not private use or grandfathered) IETF language tags as currently defined by RFC 5646 (https://tools.ietf.org/html/rfc5646, see also https://en.wikipedia.org/wiki/IETF_language_tag), i.e., use language subtags which in essence are 2-letter ISO 639-1 (https://en.wikipedia.org/wiki/ISO_639-1) or 3-letter ISO 639-3 (https://en.wikipedia.org/wiki/ISO_639-3) language codes.

An ‘RdMacros’ field can be used to hold a comma-separated list of packages from which the current package will import Rd macro definitions. These packages should also be listed in ‘Imports’, ‘Suggests’ or ‘Depends’. The macros in these packages will be imported after the system macros, in the order listed in the ‘RdMacros’ field, before any macro definitions in the current package are loaded. Macro definitions in individual .Rd files in the man directory are loaded last, and are local to later parts of that file. In case of duplicates, the last loaded definition will be used. Both R CMD Rd2pdf and R CMD Rdconv have an optional flag --RdMacros=pkglist. The option is also a comma-separated list of package names, and has priority over the value given in DESCRIPTION. Packages using Rd macros should depend on R 3.2.0 or later.

Note: There should be no ‘Built’ or ‘Packaged’ fields, as these are added by the package management tools.

There is no restriction on the use of other fields not mentioned here (but using other capitalizations of these field names would cause confusion). Fields Note, Contact (for contacting the authors/developers) and MailingList are in common use. Some repositories (including CRAN and R-forge) add their own fields.



1.1.2 Licensing

Licensing for a package which might be distributed is an important but potentially complex subject.

It is very important that you include license information! Otherwise, it may not even be legally correct for others to distribute copies of the package, let alone use it.

The package management tools use the concept of ‘free or open source software’ (FOSS, e.g., https://en.wikipedia.org/wiki/FOSS) licenses: the idea being that some users of R and its packages want to restrict themselves to such software. Others need to ensure that there are no restrictions stopping them using a package, e.g. forbidding commercial or military use. It is a central tenet of FOSS software that there are no restrictions on users nor usage.

Do not use the ‘License’ field for information on copyright holders: if needed, use a ‘Copyright’ field.

The mandatory ‘License’ field in the DESCRIPTION file should specify the license of the package in a standardized form. Alternatives are indicated via vertical bars. Individual specifications must be one of

If a package license restricts a base license (where permitted, e.g., using GPL-3 or AGPL-3 with an attribution clause), the additional terms should be placed in file LICENSE (or LICENCE), and the string ‘+ file LICENSE’ (or ‘+ file LICENCE’, respectively) should be appended to the corresponding individual license specification. Note that several commonly used licenses do not permit restrictions: this includes GPL-2 and hence any specification which includes it.

Examples of standardized specifications include

License: GPL-2
License: LGPL (>= 2.0, < 3) | Mozilla Public License
License: GPL-2 | file LICENCE
License: GPL (>= 2) | BSD_3_clause + file LICENSE
License: Artistic-2.0 | AGPL-3 + file LICENSE

Please note in particular that “Public domain” is not a valid license, since it is not recognized in some jurisdictions.

Please ensure that the license you choose also covers any dependencies (including system dependencies) of your package: it is particularly important that any restrictions on the use of such dependencies are evident to people reading your DESCRIPTION file.

Fields ‘License_is_FOSS’ and ‘License_restricts_use’ may be added by repositories where information cannot be computed from the name of the license. ‘License_is_FOSS: yes’ is used for licenses which are known to be FOSS, and ‘License_restricts_use’ can have values ‘yes’ or ‘no’ if the LICENSE file is known to restrict users or usage, or known not to. These are used by, e.g., the available.packages filters.

The optional file LICENSE/LICENCE contains a copy of the license of the package. To avoid any confusion only include such a file if it is referred to in the ‘License’ field of the DESCRIPTION file.

Whereas you should feel free to include a license file in your source distribution, please do not arrange to install yet another copy of the GNU COPYING or COPYING.LIB files but refer to the copies on https://www.R-project.org/Licenses/ and included in the R distribution (in directory share/licenses). Since files named LICENSE or LICENCE will be installed, do not use these names for standard license files. To include comments about the licensing rather than the body of a license, use a file named something like LICENSE.note.

A few “standard” licenses are rather license templates which need additional information to be completed via ‘+ file LICENSE’.



1.1.3 Package Dependencies

The ‘Depends’ field gives a comma-separated list of package names which this package depends on. Those packages will be attached before the current package when library or require is called. Each package name may be optionally followed by a comment in parentheses specifying a version requirement. The comment should contain a comparison operator, whitespace and a valid version number, e.g. ‘MASS (>= 3.1-20)’.

The ‘Depends’ field can also specify a dependence on a certain version of R — e.g., if the package works only with R version 3.0.0 or later, include ‘R (>= 3.0.0)’ in the ‘Depends’ field. You can also require a certain SVN revision for R-devel or R-patched, e.g. ‘R (>= 2.14.0), R (>= r56550)’ requires a version later than R-devel of late July 2011 (including released versions of 2.14.0).

It makes no sense to declare a dependence on R without a version specification, nor on the package base: this is an R package and package base is always available.

A package or ‘R’ can appear more than once in the ‘Depends’ field, for example to give upper and lower bounds on acceptable versions.

It is inadvisable to use a dependence on R with patchlevel (the third digit) other than zero. Doing so with packages which others depend on will cause the other packages to become unusable under earlier versions in the series, and e.g. versions 3.x.1 are widely used throughout the Northern Hemisphere academic year.

Both library and the R package checking facilities use this field: hence it is an error to use improper syntax or misuse the ‘Depends’ field for comments on other software that might be needed. The R INSTALL facilities check if the version of R used is recent enough for the package being installed, and the list of packages which is specified will be attached (after checking version requirements) before the current package.

The ‘Imports’ field lists packages whose namespaces are imported from (as specified in the NAMESPACE file) but which do not need to be attached. Namespaces accessed by the ‘::’ and ‘:::’ operators must be listed here, or in ‘Suggests’ or ‘Enhances’ (see below). Ideally this field will include all the standard packages that are used, and it is important to include S4-using packages (as their class definitions can change and the DESCRIPTION file is used to decide which packages to re-install when this happens). Packages declared in the ‘Depends’ field should not also be in the ‘Imports’ field. Version requirements can be specified and are checked when the namespace is loaded (since R >= 3.0.0).

The ‘Suggests’ field uses the same syntax as ‘Depends’ and lists packages that are not necessarily needed. This includes packages used only in examples, tests or vignettes (see Writing package vignettes), and packages loaded in the body of functions. E.g., suppose an example from package foo uses a dataset from package bar. Then it is not necessary to have bar installed in order to use foo unless one wants to execute all the examples/tests/vignettes: it is useful to have bar, but not necessary. Version requirements can be specified but should be checked by the code which uses the package.

Finally, the ‘Enhances’ field lists packages “enhanced” by the package at hand, e.g., by providing methods for classes from these packages, or ways to handle objects from these packages (so several packages have ‘Enhances: chron’ because they can handle datetime objects from chron even though they prefer R’s native datetime functions). Version requirements can be specified, but are currently not used. Such packages cannot be required to check the package: any tests which use them must be conditional on the presence of the package. (If your tests use e.g. a dataset from another package it should be in ‘Suggests’ and not ‘Enhances’.)

The general rules are

In particular, packages providing “only” data for examples or vignettes should be listed in ‘Suggests’ rather than ‘Depends’ in order to make lean installations possible.

Version dependencies in the ‘Depends’ and ‘Imports’ fields are used by library when it loads the package, and install.packages checks versions for the ‘Depends’, ‘Imports’ and (for dependencies = TRUE) ‘Suggests’ fields.

It is increasingly important that the information in these fields is complete and accurate: it is for example used to compute which packages depend on an updated package and which packages can safely be installed in parallel.

This scheme was developed before all packages had namespaces (R 2.14.0 in October 2011), and good practice changed once that was in place.

Field ‘Depends’ should nowadays be used rarely, only for packages which are intended to be put on the search path to make their facilities available to the end user (and not to the package itself): for example it makes sense that a user of package latticeExtra would want the functions of package lattice made available.

Almost always packages mentioned in ‘Depends’ should also be imported from in the NAMESPACE file: this ensures that any needed parts of those packages are available when some other package imports the current package.

The ‘Imports’ field should not contain packages which are not imported from (via the NAMESPACE file or :: or ::: operators), as all the packages listed in that field need to be installed for the current package to be installed. (This is checked by R CMD check.)

R code in the package should call library or require only exceptionally. Such calls are never needed for packages listed in ‘Depends’ as they will already be on the search path. It used to be common practice to use require calls for packages listed in ‘Suggests’ in functions which used their functionality, but nowadays it is better to access such functionality via :: calls.

A package that wishes to make use of header files in other packages needs to declare them as a comma-separated list in the field ‘LinkingTo’ in the DESCRIPTION file. For example

LinkingTo: link1, link2

The ‘LinkingTo’ field can have a version requirement which is checked at installation.

Specifying a package in ‘LinkingTo’ suffices if these are C++ headers containing source code or static linking is done at installation: the packages do not need to be (and usually should not be) listed in the ‘Depends’ or ‘Imports’ fields. This includes CRAN package BH and almost all users of RcppArmadillo and RcppEigen.

For another use of ‘LinkingTo’ see Linking to native routines in other packages.

The ‘Additional_repositories’ field is a comma-separated list of repository URLs where the packages named in the other fields may be found. It is currently used by R CMD check to check that the packages can be found, at least as source packages (which can be installed on any platform).



1.1.3.1 Suggested packages

Note that someone wanting to run the examples/tests/vignettes may not have a suggested package available (and it may not even be possible to install it for that platform). The recommendation used to be to make their use conditional via if(require("pkgname")): this is OK if that conditioning is done in examples/tests/vignettes, although using if(requireNamespace("pkgname")) is preferred, if possible.

However, using require for conditioning in package code is not good practice as it alters the search path for the rest of the session and relies on functions in that package not being masked by other require or library calls. It is better practice to use code like

   if (requireNamespace("rgl", quietly = TRUE)) {
      rgl::plot3d(...)
   } else {
      ## do something else not involving rgl.
   }

Note the use of rgl:: as that object would not necessarily be visible (and if it is, it need not be the one from that namespace: plot3d occurs in several other packages). If the intention is to give an error if the suggested package is not available, simply use e.g. rgl::plot3d.

Note that the recommendation to use suggested packages conditionally in tests does also apply to packages used to manage test suites: a notorious example was testthat which in version 1.0.0 contained illegal C++ code and hence could not be installed on standards-compliant platforms.

Some people have assumed that a ‘recommended’ package in ‘Suggests’ can safely be used unconditionally, but this is not so. (R can be installed without recommended packages, and which packages are ‘recommended’ may change.)

As noted above, packages in ‘Enhances’ must be used conditionally and hence objects within them should always be accessed via ::.



1.1.4 The INDEX file

The optional file INDEX contains a line for each sufficiently interesting object in the package, giving its name and a description (functions such as print methods not usually called explicitly might not be included). Normally this file is missing and the corresponding information is automatically generated from the documentation sources (using tools::Rdindex()) when installing from source.
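Should a static INDEX be wanted, something along the following lines (a sketch, to be run from the top-level source directory of the package) could generate it from the Rd sources:

tools::Rdindex(Sys.glob("man/*.Rd"), outFile = "INDEX")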

The file is part of the information given by library(help = pkgname).

Rather than editing this file, it is preferable to put customized information about the package into an overview help page (see Documenting packages) and/or a vignette (see Writing package vignettes).



1.1.5 Package subdirectories

The R subdirectory contains R code files, only. The code files to be installed must start with an ASCII (lower or upper case) letter or digit and have one of the extensions .R, .S, .q, .r, or .s. We recommend using .R, as this extension seems not to be used by any other software. It should be possible to read in the files using source(), so R objects must be created by assignments. Note that there need be no connection between the name of the file and the R objects created by it. Ideally, the R code files should only directly assign R objects and definitely should not call functions with side effects such as require and options. If computations are required to create objects these can use code ‘earlier’ in the package (see the ‘Collate’ field) plus functions in the ‘Depends’ packages provided that the objects created do not depend on those packages except via namespace imports.

Two exceptions are allowed: if the R subdirectory contains a file sysdata.rda (a saved image of one or more R objects: please use suitable compression as suggested by tools::resaveRdaFiles, and see also the ‘SysDataCompression’ DESCRIPTION field.) this will be lazy-loaded into the namespace environment – this is intended for system datasets that are not intended to be user-accessible via data. Also, files ending in ‘.in’ will be allowed in the R directory to allow a configure script to generate suitable files.
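As a minimal sketch (the object lookup_table is a made-up example of an internal dataset), such a file could be created from the top-level source directory by:

lookup_table <- data.frame(key = c("a", "b", "c"), value = 1:3)   # placeholder internal object
save(lookup_table, file = "R/sysdata.rda", compress = "xz")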

Only ASCII characters (and the control characters tab, formfeed, LF and CR) should be used in code files. Other characters are accepted in comments, but then the comments may not be readable in e.g. a UTF-8 locale. Non-ASCII characters in object names will normally fail when the package is installed. Any byte will be allowed in a quoted character string but \uxxxx escapes should be used for non-ASCII characters. However, non-ASCII character strings may not be usable in some locales and may display incorrectly in others.

Various R functions in a package can be used to initialize and clean up. See Load hooks.

The man subdirectory should contain (only) documentation files for the objects in the package in R documentation (Rd) format. The documentation filenames must start with an ASCII (lower or upper case) letter or digit and have the extension .Rd (the default) or .rd. Further, the names must be valid in ‘file://’ URLs, which means they must be entirely ASCII and not contain ‘%’. See Writing R documentation files, for more information. Note that all user-level objects in a package should be documented; if a package pkg contains user-level objects which are for “internal” use only, it should provide a file pkg-internal.Rd which documents all such objects, and clearly states that these are not meant to be called by the user. See e.g. the sources for package grid in the R distribution. Note that packages which use internal objects extensively should not export those objects from their namespace, when they do not need to be documented (see Package namespaces).

Having a man directory containing no documentation files may give an installation error.

The man subdirectory may contain a subdirectory named macros; this will contain source for user-defined Rd macros. (See User-defined macros.) These use the Rd format, but may not contain anything but macro definitions, comments and whitespace.

The R and man subdirectories may contain OS-specific subdirectories named unix or windows.

The sources and headers for the compiled code are in src, plus optionally a file Makevars or Makefile. When a package is installed using R CMD INSTALL, make is used to control compilation and linking into a shared object for loading into R. There are default make variables and rules for this (determined when R is configured and recorded in R_HOME/etc${R_ARCH}/Makeconf), providing support for C, C++, FORTRAN 77, Fortran 9x, Objective C and Objective C++ with associated extensions .c, .cc or .cpp, .f, .f90 or .f95, .m, and .mm, respectively. We recommend using .h for headers, also for C++ or Fortran 9x include files. (Use of extension .C for C++ is no longer supported.) Files in the src directory should not be hidden (start with a dot), and hidden files will under some versions of R be ignored.

It is not portable (and may not be possible at all) to mix all these languages in a single package, and we do not support using both C++ and Fortran 9x. Because R itself uses it, we know that C and FORTRAN 77 can be used together and mixing C and C++ seems to be widely successful.

If your code needs to depend on the platform there are certain defines which can be used in C or C++. On all Windows builds (even 64-bit ones) ‘_WIN32’ will be defined: on 64-bit Windows builds also ‘_WIN64’, and on macOS ‘__APPLE__’ is defined.

The default rules can be tweaked by setting macros in a file src/Makevars (see Using Makevars). Note that this mechanism should be general enough to eliminate the need for a package-specific src/Makefile. If such a file is to be distributed, considerable care is needed to make it general enough to work on all R platforms. If it has any targets at all, it should have an appropriate first target named ‘all’ and a (possibly empty) target ‘clean’ which removes all files generated by running make (to be used by ‘R CMD INSTALL --clean’ and ‘R CMD INSTALL --preclean’). There are platform-specific file names on Windows: src/Makevars.win takes precedence over src/Makevars and src/Makefile.win must be used. Some make programs require makefiles to have a complete final line, including a newline.

A few packages use the src directory for purposes other than making a shared object (e.g. to create executables). Such packages should have files src/Makefile and src/Makefile.win (unless intended for only Unix-alikes or only Windows).

In very special cases packages may create binary files other than the shared objects/DLLs in the src directory. Such files will not be installed in a multi-architecture setting since R CMD INSTALL --libs-only is used to merge multiple sub-architectures and it only copies shared objects/DLLs. If a package wants to install other binaries (for example executable programs), it should provide an R script src/install.libs.R which will be run as part of the installation in the src build directory instead of copying the shared objects/DLLs. The script is run in a separate R environment containing the following variables: R_PACKAGE_NAME (the name of the package), R_PACKAGE_SOURCE (the path to the source directory of the package), R_PACKAGE_DIR (the path of the target installation directory of the package), R_ARCH (the arch-dependent part of the path, often empty), SHLIB_EXT (the extension of shared objects) and WINDOWS (TRUE on Windows, FALSE elsewhere). Something close to the default behavior could be replicated with the following src/install.libs.R file:

files <- Sys.glob(paste0("*", SHLIB_EXT))
dest <- file.path(R_PACKAGE_DIR, paste0('libs', R_ARCH))
dir.create(dest, recursive = TRUE, showWarnings = FALSE)
file.copy(files, dest, overwrite = TRUE)
if(file.exists("symbols.rds"))
    file.copy("symbols.rds", dest, overwrite = TRUE)

On the other hand, executable programs could be installed along the lines of

execs <- c("one", "two", "three")
if(WINDOWS) execs <- paste0(execs, ".exe")
if ( any(file.exists(execs)) ) {
  dest <- file.path(R_PACKAGE_DIR,  paste0('bin', R_ARCH))
  dir.create(dest, recursive = TRUE, showWarnings = FALSE)
  file.copy(execs, dest, overwrite = TRUE)
}

Note the use of architecture-specific subdirectories of bin where needed.

The data subdirectory is for data files: See Data in packages.

The demo subdirectory is for R scripts (for running via demo()) that demonstrate some of the functionality of the package. Demos may be interactive and are not checked automatically, so if testing is desired use code in the tests directory to achieve this. The script files must start with a (lower or upper case) letter and have one of the extensions .R or .r. If present, the demo subdirectory should also have a 00Index file with one line for each demo, giving its name and a description separated by a tab or at least three spaces. (This index file is not generated automatically.) Note that a demo does not have a specified encoding and so should be an ASCII file (see Encoding issues). Function demo() will use the package encoding if there is one, but this is mainly useful for non-ASCII comments.
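For example (using the standard package graphics purely for illustration), the demos shipped with a package can be listed and run with:

demo(package = "graphics")            # list the available demos with their descriptions
demo("image", package = "graphics")   # run one of them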

The contents of the inst subdirectory will be copied recursively to the installation directory. Subdirectories of inst should not interfere with those used by R (currently, R, data, demo, exec, libs, man, help, html and Meta, and earlier versions used latex, R-ex). The copying of the inst happens after src is built so its Makefile can create files to be installed. To exclude files from being installed, one can specify a list of exclude patterns in file .Rinstignore in the top-level source directory. These patterns should be Perl-like regular expressions (see the help for regexp in R for the precise details), one per line, to be matched case-insensitively against the file and directory paths, e.g. doc/.*[.]png$ will exclude all PNG files in inst/doc based on the extension.

Note that with the exceptions of INDEX, LICENSE/LICENCE and NEWS, information files at the top level of the package will not be installed and so not be known to users of Windows and macOS compiled packages (and not seen by those who use R CMD INSTALL or install.packages on the tarball). So any information files you wish an end user to see should be included in inst. Note that if the named exceptions also occur in inst, the version in inst will be that seen in the installed package.

Things you might like to add to inst are a CITATION file for use by the citation function, and a NEWS.Rd file for use by the news function. See its help page for the specific format restrictions of the NEWS.Rd file.

Another file sometimes needed in inst is AUTHORS or COPYRIGHTS to specify the authors or copyright holders when this is too complex to put in the DESCRIPTION file.

Subdirectory tests is for additional package-specific test code, similar to the specific tests that come with the R distribution. Test code can either be provided directly in a .R (or .r as from R 3.4.0) file, or via a .Rin file containing code which in turn creates the corresponding .R file (e.g., by collecting all function objects in the package and then calling them with the strangest arguments). The results of running a .R file are written to a .Rout file. If there is a corresponding .Rout.save file, these two are compared, with differences being reported but not causing an error. The directory tests is copied to the check area, and the tests are run with the copy as the working directory and with R_LIBS set to ensure that the copy of the package installed during testing will be found by library(pkg_name). Note that the package-specific tests are run in a vanilla R session without setting the random-number seed, so tests which use random numbers will need to set the seed to obtain reproducible results (and it can be helpful to do so in all cases, to avoid occasional failures when tests are run).
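A hypothetical test file tests/test-foo.R might look as follows (pkgname and the function foo are placeholders; any error stops the test and is reported by R CMD check):

library(pkgname)                     # 'pkgname' is a placeholder for the package under test
set.seed(1)                          # the seed is not set for us, so set it for reproducibility
x <- rnorm(5)
stopifnot(all.equal(foo(x), 2 * x))  # errors, and hence fails the test, if foo() does not double its input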

If directory tests has a subdirectory Examples containing a file pkg-Ex.Rout.save, this is compared to the output file for running the examples when the latter are checked. Reference output should be produced without having the --timings option set (and note that --as-cran sets it).

Subdirectory exec could contain additional executable scripts the package needs, typically scripts for interpreters such as the shell, Perl, or Tcl. NB: only files (and not directories) under exec are installed (and those with names starting with a dot are ignored), and they are all marked as executable (mode 755, moderated by ‘umask’) on POSIX platforms. Note too that this is not suitable for executable programs since some platforms (including Windows) support multiple architectures using the same installed package directory.

Subdirectory po is used for files related to localization: see Internationalization.

Subdirectory tools is the preferred place for auxiliary files needed during configuration, and also for sources needed to re-create scripts (e.g. M4 files for autoconf).



1.1.6 Data in packages

The data subdirectory is for data files, either to be made available via lazy-loading or for loading using data(). (The choice is made by the ‘LazyData’ field in the DESCRIPTION file: the default is not to do so.) It should not be used for other data files needed by the package, and the convention has grown up to use directory inst/extdata for such files.

Data files can have one of three types as indicated by their extension: plain R code (.R or .r), tables (.tab, .txt, or .csv, see ?data for the file formats, and note that .csv is not the standard CSV format), or save() images (.RData or .rda). The files should not be hidden (have names starting with a dot). Note that R code should be “self-sufficient” and not make use of extra functionality provided by the package, so that the data file can also be used without having to load the package or its namespace.

Images (extensions .RData or .rda) can contain references to the namespaces of packages that were used to create them. Preferably there should be no such references in data files, and in any case they should only be to packages listed in the Depends and Imports fields, as otherwise it may be impossible to install the package. To check for such references, load all the images into a vanilla R session, and look at the output of loadedNamespaces().
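A minimal way to do this check (run in a vanilla session, e.g. R --vanilla, started in the package's data directory) is:

invisible(lapply(Sys.glob(c("*.rda", "*.RData")), load, envir = .GlobalEnv))
loadedNamespaces()   # compare against the packages listed in Depends and Imports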

If your data files are large and you are not using ‘LazyData’ you can speed up installation by providing a file datalist in the data subdirectory. This should have one line per topic that data() will find, in the format ‘foo’ if data(foo) provides ‘foo’, or ‘foo: bar bah’ if data(foo) provides ‘bar’ and ‘bah’. R CMD build will automatically add a datalist file to data directories of over 1Mb, using the function tools::add_datalist.

Tables (.tab, .txt, or .csv files) can be compressed by gzip, bzip2 or xz, optionally with additional extension .gz, .bz2 or .xz.

If your package is to be distributed, do consider the resource implications of large datasets for your users: they can make packages very slow to download and use up unwelcome amounts of storage space, as well as taking many seconds to load. It is normally best to distribute large datasets as .rda images prepared by save(, compress = TRUE) (the default). Using bzip2 or xz compression will usually reduce the size of both the package tarball and the installed package, in some cases by a factor of two or more.

Package tools has a couple of functions to help with data images: checkRdaFiles reports on the way the image was saved, and resaveRdaFiles will re-save with a different type of compression, including choosing the best type for that particular image.
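For example, from the top-level source directory of a package one might run:

tools::checkRdaFiles("data")                       # report how each image in data/ was saved
tools::resaveRdaFiles("data", compress = "auto")   # re-save each file, choosing the compression giving the smallest result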

Some packages using ‘LazyData’ will benefit from using a form of compression other than gzip in the installed lazy-loading database. This can be selected by the --data-compress option to R CMD INSTALL or by using the ‘LazyDataCompression’ field in the DESCRIPTION file. Useful values are bzip2, xz and the default, gzip. The only way to discover which is best is to try them all and look at the size of the pkgname/data/Rdata.rdb file.

Lazy-loading is not supported for very large datasets (those which when serialized exceed 2GB, the limit for the format on 32-bit platforms).

The analogue for sysdata.rda is field ‘SysDataCompression’: the default is xz for files bigger than 1MB otherwise gzip.



1.1.7 Non-R scripts in packages

Code which needs to be compiled (C, C++, FORTRAN, Fortran 95 …) is included in the src subdirectory and discussed elsewhere in this document.

Subdirectory exec could be used for scripts for interpreters such as the shell, BUGS, JavaScript, Matlab, Perl, php (amap), Python or Tcl (Simile), or even R. However, it seems more common to use the inst directory, for example WriteXLS/inst/Perl, NMF/inst/m-files, RnavGraph/inst/tcl, RProtoBuf/inst/python and emdbook/inst/BUGS and gridSVG/inst/js.

Java code is a special case: except for very small programs, .java files should be byte-compiled (to a .class file) and distributed as part of a .jar file: the conventional location for the .jar file(s) is inst/java. It is desirable (and required under an Open Source license) to make the Java source files available: this is best done in a top-level java directory in the package—the source files should not be installed.

If your package requires one of these interpreters or an extension then this should be declared in the ‘SystemRequirements’ field of its DESCRIPTION file. (Users of Java most often do so via rJava, when depending on/importing that suffices.)

Windows and Mac users should be aware that the Tcl extensions ‘BWidget’ and ‘Tktable’ which are currently included with the R for Windows and in the macOS installers are extensions and do need to be declared for users of other platforms (and that ‘Tktable’ is less widely available than it used to be, including not in the main repositories for major Linux distributions).

‘BWidget’ needs to be installed by the user on other OSes. This is fairly easy to do: first find the Tcl/Tk search path:

library(tcltk)
strsplit(tclvalue('auto_path'), " ")[[1]]

then download the sources from https://sourceforge.net/projects/tcllib/files/BWidget/ and at the command line run something like

tar xf bwidget-1.9.8.tar.gz
sudo mv bwidget-1.9.8 /usr/local/lib

substituting a location on the Tcl/Tk search path for /usr/local/lib if needed.



1.1.8 Specifying URLs

URLs in many places in the package documentation will be converted to clickable hyperlinks in at least some of their renderings. So care is needed that their forms are correct and portable.

The full URL should be given, including the scheme (often ‘http://’ or ‘https://’) and a final ‘/’ for references to directories.

Spaces in URLs are not portable and how they are handled does vary by HTTP server and by client. There should be no space in the host part of an ‘http://’ URL, and spaces in the remainder should be encoded, with each space replaced by ‘%20’.

Other characters may benefit from being encoded: see the help on URLencode().
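For example (the address below is purely illustrative):

utils::URLencode("https://example.org/my docs/report 1.html")   # 'example.org' is a made-up host
# gives "https://example.org/my%20docs/report%201.html"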

The canonical URL for a CRAN package is

https://cran.r-project.org/package=pkgname

and not a version starting ‘https://cran.r-project.org/web/packages/pkgname’.



1.2 Configure and cleanup

Note that most of this section is specific to Unix-alikes: see the comments later on about the Windows port of R.

If your package needs some system-dependent configuration before installation you can include an executable (Bourne) shell script configure in your package which (if present) is executed by R CMD INSTALL before any other action is performed. This can be a script created by the Autoconf mechanism, but may also be a script written by yourself. Use this to detect if any nonstandard libraries are present such that corresponding code in the package can be disabled at install time rather than giving error messages when the package is compiled or used. To summarize, the full power of Autoconf is available for your extension package (including variable substitution, searching for libraries, etc.).

Under a Unix-alike only, an executable (Bourne shell) script cleanup is executed as the last thing by R CMD INSTALL if option --clean was given, and by R CMD build when preparing the package for building from its source.

As an example consider we want to use functionality provided by a (C or FORTRAN) library foo. Using Autoconf, we can create a configure script which checks for the library, sets variable HAVE_FOO to TRUE if it was found and to FALSE otherwise, and then substitutes this value into output files (by replacing instances of ‘@HAVE_FOO@’ in input files with the value of HAVE_FOO). For example, if a function named bar is to be made available by linking against library foo (i.e., using -lfoo), one could use

AC_CHECK_LIB(foo, bar, [HAVE_FOO=TRUE], [HAVE_FOO=FALSE])
AC_SUBST(HAVE_FOO)
......
AC_CONFIG_FILES([foo.R])
AC_OUTPUT

in configure.ac (assuming Autoconf 2.50 or later).

The definition of the respective R function in foo.R.in could be

foo <- function(x) {
    if(!@HAVE_FOO@)
      stop("Sorry, library ‘foo’ is not available")
    ...

From this file configure creates the actual R source file foo.R looking like

foo <- function(x) {
    if(!FALSE)
      stop("Sorry, library ‘foo’ is not available")
    ...

if library foo was not found (with the desired functionality). In this case, the above R code effectively disables the function.

One could also use different file fragments for available and missing functionality, respectively.

You will very likely need to ensure that the same C compiler and compiler flags are used in the configure tests as when compiling R or your package. Under a Unix-alike, you can achieve this by including the following fragment early in configure.ac (before calling AC_PROG_CC)

: ${R_HOME=`R RHOME`}
if test -z "${R_HOME}"; then
  echo "could not determine R_HOME"
  exit 1
fi
CC=`"${R_HOME}/bin/R" CMD config CC`
CFLAGS=`"${R_HOME}/bin/R" CMD config CFLAGS`
CPPFLAGS=`"${R_HOME}/bin/R" CMD config CPPFLAGS`

(Using ‘${R_HOME}/bin/R’ rather than just ‘R’ is necessary in order to use the correct version of R when running the script as part of R CMD INSTALL, and the quotes since ‘${R_HOME}’ might contain spaces.)

If your code does load checks then you may also need

LDFLAGS=`"${R_HOME}/bin/R" CMD config LDFLAGS`

and packages written with C++ need to pick up the details for the C++ compiler and switch the current language to C++ by something like

CXX=`"${R_HOME}/bin/R" CMD config CXX`
CXXFLAGS=`"${R_HOME}/bin/R" CMD config CXXFLAGS`
AC_LANG(C++)

The latter is important, as for example C headers may not be available to C++ programs or may not be written to avoid C++ name-mangling.

You can use R CMD config for getting the value of the basic configuration variables, and also the header and library flags necessary for linking a front-end executable program against R, see R CMD config --help for details.

To check for an external BLAS library using the ACX_BLAS macro from the official Autoconf Macro Archive, one can simply do

F77=`"${R_HOME}/bin/R" CMD config F77`
AC_PROG_F77
FLIBS=`"${R_HOME}/bin/R" CMD config FLIBS`
ACX_BLAS([], AC_MSG_ERROR([could not find your BLAS library], 1))

Note that FLIBS as determined by R must be used to ensure that FORTRAN 77 code works on all R platforms. Calls to the Autoconf macro AC_F77_LIBRARY_LDFLAGS, which would overwrite FLIBS, must not be used (and hence e.g. removed from ACX_BLAS). (Recent versions of Autoconf in fact allow an already set FLIBS to override the test for the FORTRAN linker flags.)

N.B.: If the configure script creates files, e.g. src/Makevars, you do need a cleanup script to remove them. Otherwise R CMD build may ship the files that are created. For example, package RODBC has

#!/bin/sh

rm -f config.* src/Makevars src/config.h

As this example shows, configure often creates working files such as config.log.

If your configure script needs auxiliary files, it is recommended that you ship them in a tools directory (as R itself does).

You should bear in mind that the configure script will not be used on Windows systems. If your package is to be made publicly available, please give enough information for a user on a non-Unix-alike platform to configure it manually, or provide a configure.win script to be used on that platform. (Optionally, there can be a cleanup.win script. Both should be shell scripts to be executed by ash, which is a minimal version of Bourne-style sh.) When configure.win is run the environment variables R_HOME (which uses ‘/’ as the file separator), R_ARCH and R_ARCH_BIN will be set. Use R_ARCH to decide if this is a 64-bit build (its value there is ‘/x64’) and to install DLLs to the correct place (${R_HOME}/libs${R_ARCH}). Use R_ARCH_BIN to find the correct place under the bin directory, e.g. ${R_HOME}/bin${R_ARCH_BIN}/Rscript.exe.

In some rare circumstances, the configuration and cleanup scripts need to know the location into which the package is being installed. An example of this is a package that uses C code and creates two shared object/DLLs. Usually, the object that is dynamically loaded by R is linked against the second, dependent, object. On some systems, we can add the location of this dependent object to the object that is dynamically loaded by R. This means that each user does not have to set the value of the LD_LIBRARY_PATH (or equivalent) environment variable, but that the secondary object is automatically resolved. Another example is when a package installs support files that are required at run time, and their location is substituted into an R data structure at installation time. The names of the top-level library directory (i.e., specifiable via the ‘-l’ argument) and the directory of the package itself are made available to the installation scripts via the two shell/environment variables R_LIBRARY_DIR and R_PACKAGE_DIR. Additionally, the name of the package (e.g. ‘survival’ or ‘MASS’) being installed is available from the environment variable R_PACKAGE_NAME. (Currently the value of R_PACKAGE_DIR is always ${R_LIBRARY_DIR}/${R_PACKAGE_NAME}, but this used not to be the case when versioned installs were allowed. Its main use is in configure.win scripts for the installation path of external software’s DLLs.) Note that the value of R_PACKAGE_DIR may contain spaces and other shell-unfriendly characters, and so should be quoted in makefiles and configure scripts.

One of the more tricky tasks can be to find the headers and libraries of external software. One tool which is increasingly available on Unix-alikes (but not by default on macOS) to do this is pkg-config. The configure script will need to test for the presence of the command itself (see for example package Cairo), and if present it can be asked if the software is installed, of a suitable version and for compilation/linking flags by e.g.

$ pkg-config --exists 'QtCore >= 4.0.0'  # check the status
$ pkg-config --modversion QtCore
4.7.1
$ pkg-config --cflags QtCore
-DQT_SHARED -I/usr/include/QtCore
$ pkg-config --libs QtCore
-lQtCore

Note that pkg-config --libs gives the information required to link against the default version of that library (usually the dynamic one), and pkg-config --static is needed if the static library is to be used.

Sometimes the name by which the software is known to pkg-config is not what one might expect (e.g. ‘gtk+-2.0’ even for 2.22). To get a complete list use

pkg-config --list-all | sort

Next: , Previous: , Up: Configure and cleanup   [Contents][Index]

1.2.1 Using Makevars

Sometimes writing your own configure script can be avoided by supplying a file Makevars: also one of the most common uses of a configure script is to make Makevars from Makevars.in.

A Makevars file is a makefile and is used as one of several makefiles by R CMD SHLIB (which is called by R CMD INSTALL to compile code in the src directory). It should be written if at all possible in a portable style, in particular (except for Makevars.win) without the use of GNU extensions.

The most common use of a Makevars file is to set additional preprocessor options (for example include paths) for C/C++ files via PKG_CPPFLAGS, and additional compiler flags by setting PKG_CFLAGS, PKG_CXXFLAGS, PKG_FFLAGS or PKG_FCFLAGS, for C, C++, FORTRAN or Fortran 9x respectively (see Creating shared objects).

N.B.: Include paths are preprocessor options, not compiler options, and must be set in PKG_CPPFLAGS as otherwise platform-specific paths (e.g. ‘-I/usr/local/include’) will take precedence.

Makevars can also be used to set flags for the linker, for example ‘-L’ and ‘-l’ options, via PKG_LIBS.
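
For instance, a minimal sketch of a src/Makevars for a package linking against a hypothetical external library foo installed under /opt/foo (the library name and paths are purely illustrative) might be

# header search path: a preprocessor option, so it belongs in PKG_CPPFLAGS
PKG_CPPFLAGS = -I/opt/foo/include
# linker search path and library
PKG_LIBS = -L/opt/foo/lib -lfoo

Hard-coded paths like these are rarely portable, which is why such values are more commonly substituted into Makevars by a configure script as described above.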

When writing a Makevars file for a package you intend to distribute, take care to ensure that it is not specific to your compiler: flags such as -O2 -Wall -pedantic (and all other -W flags: for the Oracle compilers these are used to pass arguments to compiler phases) are all specific to GCC.

Also, do not set variables such as CPPFLAGS, CFLAGS etc.: these should be settable by users (sites) through appropriate personal (site-wide) Makevars files. See Customizing package compilation in R Installation and Administration.

There are some macros26 which are set whilst configuring the building of R itself and are stored in R_HOME/etcR_ARCH/Makeconf. That makefile is included as a Makefile after Makevars[.win], and the macros it defines can be used in macro assignments and make command lines in the latter. These include

FLIBS

A macro containing the set of libraries needed to link FORTRAN code. This may need to be included in PKG_LIBS: it will normally be included automatically if the package contains FORTRAN source files.

BLAS_LIBS

A macro containing the BLAS libraries used when building R. This may need to be included in PKG_LIBS. Beware that if it is empty then the R executable will contain all the double-precision and double-complex BLAS routines, but no single-precision nor complex routines. If BLAS_LIBS is included, then FLIBS also needs to be27 included following it, as most BLAS libraries are written at least partially in FORTRAN.

LAPACK_LIBS

A macro containing the LAPACK libraries (and paths where appropriate) used when building R. This may need to be included in PKG_LIBS. It may point to a dynamic library libRlapack which contains the main double-precision LAPACK routines as well as those double-complex LAPACK routines needed to build R, or it may point to an external LAPACK library, or may be empty if an external BLAS library also contains LAPACK.

[libRlapack includes all the double-precision LAPACK routines which were current in 2003: a list of which routines are included is in file src/modules/lapack/README. Note that an external LAPACK/BLAS library need not do so, as some were ‘deprecated’ (and not compiled by default) in LAPACK 3.6.0 in late 2015.]

For portability, the macros BLAS_LIBS and FLIBS should always be included after LAPACK_LIBS (and in that order); a short example follows this list of macros.

SAFE_FFLAGS

A macro containing flags which are needed to circumvent over-optimization of FORTRAN code: it is typically ‘-g -O2 -ffloat-store’ on ‘ix86’ platforms using gfortran. Note that this is not an additional flag to be used as part of PKG_FFLAGS, but a replacement for FFLAGS, and that it is intended for the FORTRAN 77 compiler ‘F77’ and not necessarily for the Fortran 90/95 compiler ‘FC’. See the example later in this section.
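
As an illustration of the ordering just described, a package whose compiled code calls LAPACK (and hence usually BLAS) routines would typically set in src/Makevars

# LAPACK first, then BLAS, then the FORTRAN runtime libraries
PKG_LIBS = $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)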

Setting certain macros in Makevars will prevent R CMD SHLIB setting them: in particular if Makevars sets ‘OBJECTS’ it will not be set on the make command line. This can be useful in conjunction with implicit rules to allow other types of source code to be compiled and included in the shared object. It can also be used to control the set of files which are compiled, either by excluding some files in src or including some files in subdirectories. For example

OBJECTS = 4dfp/endianio.o 4dfp/Getifh.o R4dfp-object.o

Note that Makevars should not normally contain targets, as it is included before the default makefile and make will call the first target, intended to be all in the default makefile. If you really need to circumvent that, use a suitable (phony) target all before any actual targets in Makevars.[win]: for example package fastICA used to have

PKG_LIBS = @BLAS_LIBS@

SLAMC_FFLAGS=$(R_XTRA_FFLAGS) $(FPICFLAGS) $(SHLIB_FFLAGS) $(SAFE_FFLAGS)

all: $(SHLIB)

slamc.o: slamc.f
        $(F77) $(SLAMC_FFLAGS) -c -o slamc.o slamc.f

needed to ensure that the LAPACK routines find some constants without infinite looping. The Windows equivalent was

all: $(SHLIB)

slamc.o: slamc.f
        $(F77) $(SAFE_FFLAGS) -c -o slamc.o slamc.f

(since the other macros are all empty on that platform, and R’s internal BLAS was not used). Note that the first target in Makevars will be called, but for back-compatibility it is best named all.

If you want to create and then link to a library, say using code in a subdirectory, use something like

.PHONY: all mylibs

all: $(SHLIB)
$(SHLIB): mylibs

mylibs:
        (cd subdir; $(MAKE))

Be careful to create all the necessary dependencies, as there is no guarantee that the dependencies of all will be run in a particular order (and some of the CRAN build machines use multiple CPUs and parallel makes). In particular,

all: mylibs

does not suffice.

Note that on Windows it is required that Makevars[.win] does create a DLL: this is needed as it is the only reliable way to ensure that building a DLL succeeded. If you want to use the src directory for some purpose other than building a DLL, use a Makefile.win file.

It is sometimes useful to have a target ‘clean’ in Makevars or Makevars.win: this will be used by R CMD build to clean up (a copy of) the package sources. When it is run by build it will have fewer macros set, in particular not $(SHLIB), nor $(OBJECTS) unless set in the file itself. It would also be possible to add tasks to the target ‘shlib-clean’ which is run by R CMD INSTALL and R CMD SHLIB with options --clean and --preclean.
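
A minimal sketch of such a ‘clean’ target (the file patterns are illustrative only), guarded by an all target so that it does not become the default goal (see the note above about targets in Makevars), is

.PHONY: all clean

all: $(SHLIB)

# tidy up working files left in src; the patterns are examples only
clean:
        rm -f *.o *.mod config.log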

If you want to run R code in Makevars, e.g. to find configuration information, please do ensure that you use the correct copy of R or Rscript: there might not be one in the path at all, or it might be the wrong version or architecture. The correct way to do this is via

"$(R_HOME)/bin$(R_ARCH_BIN)/Rscript" filename
"$(R_HOME)/bin$(R_ARCH_BIN)/Rscript" -e ‘R expression

where $(R_ARCH_BIN) is only needed currently on Windows.

Environment or make variables can be used to select different macros for 32- and 64-bit code, for example (GNU make syntax, allowed on Windows)

ifeq "$(WIN)" "64"
PKG_LIBS = value for 64-bit Windows
else
PKG_LIBS = value for 32-bit Windows
endif

On Windows there is normally a choice between linking to an import library or directly to a DLL. Where possible, the latter is much more reliable: import libraries are tied to a specific toolchain, and in particular on 64-bit Windows two different conventions have been commonly used. So for example instead of

PKG_LIBS = -L$(XML_DIR)/lib -lxml2

one can use

PKG_LIBS = -L$(XML_DIR)/bin -lxml2

since on Windows -lxxx will look in turn for

libxxx.dll.a
xxx.dll.a
libxxx.a
xxx.lib
libxxx.dll
xxx.dll

where the first and second are conventionally import libraries, the third and fourth often static libraries (with .lib intended for Visual C++), but might be import libraries. See for example https://sourceware.org/binutils/docs-2.20/ld/WIN32.html#WIN32.

The fly in the ointment is that the DLL might not be named libxxx.dll, and in fact on 32-bit Windows there is a libxml2.dll whereas on one build for 64-bit Windows the DLL is called libxml2-2.dll. Using import libraries can cover over these differences but can cause equal difficulties.

If static libraries are available they can save a lot of problems with run-time finding of DLLs, especially when binary packages are to be distributed and even more when these support both architectures. Where using DLLs is unavoidable we normally arrange (via configure.win) to ship them in the same directory as the package DLL.


Next: , Previous: , Up: Using Makevars   [Contents][Index]

1.2.1.1 OpenMP support

There is some support for packages which wish to use OpenMP28. The make macros

SHLIB_OPENMP_CFLAGS
SHLIB_OPENMP_CXXFLAGS
SHLIB_OPENMP_FCFLAGS
SHLIB_OPENMP_FFLAGS

are available for use in src/Makevars or src/Makevars.win. Include the appropriate macro in PKG_CFLAGS, PKG_CPPFLAGS and so on, and also in PKG_LIBS. C/C++ code that needs to be conditioned on the use of OpenMP can be used inside #ifdef _OPENMP: note that some toolchains used for R (including that of macOS and some others using clang29) have no OpenMP support at all, not even omp.h.

For example, a package with C code written for OpenMP should have in src/Makevars the lines

PKG_CFLAGS = $(SHLIB_OPENMP_CFLAGS)
PKG_LIBS = $(SHLIB_OPENMP_CFLAGS)

Note that the macro SHLIB_OPENMP_CXXFLAGS applies to the default C++ compiler and not necessarily to the C++11/14/17 compiler: users of the latter should do their own configure checks (an example is available in CRAN package ARTP2).

Some care is needed when compilers are from different families which may use different OpenMP runtimes (e.g. clang vs GCC including gfortran, although it is currently possible to use the clang runtime with GCC but not vice versa). For a package with Fortran 77 code using OpenMP the appropriate lines are

PKG_FFLAGS = $(SHLIB_OPENMP_FFLAGS)
PKG_LIBS = $(SHLIB_OPENMP_CFLAGS)

as the C compiler will be used to link the package code (and there is no guarantee that this will work everywhere). (This does not apply to Fortran 9x code, where SHLIB_OPENMP_FCFLAGS should be used in both PKG_FCFLAGS and PKG_LIBS.)
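
By analogy, a sketch of the corresponding src/Makevars lines for a package whose OpenMP code is in Fortran 9x source files is

# Fortran 9x OpenMP flags, used for both compilation and linking
PKG_FCFLAGS = $(SHLIB_OPENMP_FCFLAGS)
PKG_LIBS = $(SHLIB_OPENMP_FCFLAGS)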

For portability, any C/C++ code using the omp_* functions should include the omp.h header: some compilers (but not all) include it when OpenMP mode is switched on (e.g. via flag -fopenmp).

There is nothing30 to say what version of OpenMP is supported: version 3.1 (and much of 4.0) is supported by recent versions31 of the Linux, Windows and Solaris platforms, but portable packages cannot assume that end users have recent versions.32 macOS currently uses Apple builds of clang with no OpenMP support (even if invoked as gcc and despite the man page including the flag -fopenmp for that command). http://www.openmp.org/resources/openmp-compilers gives some idea of what compilers support what versions.

The performance of OpenMP varies substantially between platforms. The Windows implementation has substantial overheads33, so is only beneficial if quite substantial tasks are run in parallel. Also, on Windows new threads are started with the default34 FPU control word, so computations done on OpenMP threads will not make use of extended-precision arithmetic which is the default for the main process.

Calling any of the R API from threaded code is ‘for experts only’: they will need to read the source code to determine if it is thread-safe. In particular, code which makes use of the stack-checking mechanism must not be called from threaded code.

Packages are not stand-alone programs, and an R process could contain more than one OpenMP-enabled package as well as other components (for example, an optimized BLAS) making use of OpenMP. So careful consideration needs to be given to resource usage. OpenMP works with parallel regions, and for most implementations the default is to use as many threads as ‘CPUs’ for such regions. Parallel regions can be nested, although it is common to use only a single thread below the first level. Neither the correctness of the detected number of ‘CPUs’ nor the assumption that the R process is entitled to use them all should be taken for granted. The best way to limit resources is to limit the overall number of threads available to OpenMP in the R process: this can be done via environment variable OMP_THREAD_LIMIT, where implemented.35 Alternatively, the number of threads per region can be limited by the environment variable OMP_NUM_THREADS or API call omp_set_num_threads, or, better, for the regions in your code as part of their specification. E.g. R uses

#pragma omp parallel for num_threads(nthreads) …

That way you only control your own code and not that of other OpenMP users.


Next: , Previous: , Up: Using Makevars   [Contents][Index]

1.2.1.2 Using pthreads

There is no direct support for POSIX threads (more commonly known as pthreads): by the time we considered adding it, several packages were already using them unconditionally, so it seems that nowadays they are universally available on POSIX operating systems (hence not Windows).

For reasonably recent versions of gcc and clang the correct specification is

PKG_CPPFLAGS = -pthread
PKG_LIBS = -pthread

(and the plural version is also accepted on some systems/versions). For other platforms the specification is

PKG_CPPFLAGS = -D_REENTRANT
PKG_LIBS = -lpthread

(and note that the library name is singular). This is what -pthread does on all known current platforms (although earlier versions of OpenBSD used a different library name).

For a tutorial see https://computing.llnl.gov/tutorials/pthreads/.

POSIX threads are not normally used on Windows, which has its own native concepts of threads. However, there are two projects implementing pthreads on top of Windows, pthreads-w32 and winpthreads (part of the MinGW-w64 project).

Whether Windows toolchains implement pthreads is up to the toolchain provider. A make variable SHLIB_PTHREAD_FLAGS is available: this should be included in both PKG_CPPFLAGS (or the Fortran or F9x equivalents) and PKG_LIBS.
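
A minimal sketch of the corresponding lines in src/Makevars.win is

# pthreads flags supplied by the toolchain (may be empty if unsupported)
PKG_CPPFLAGS = $(SHLIB_PTHREAD_FLAGS)
PKG_LIBS = $(SHLIB_PTHREAD_FLAGS)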

The presence of a working pthreads implementation cannot be unambiguously determined without testing for yourself: however, that ‘_REENTRANT’ is defined36 in C/C++ code is a good indication.

Note that not all pthreads implementations are equivalent as parts are optional (see http://pubs.opengroup.org/onlinepubs/009695399/basedefs/pthread.h.html): for example, macOS lacks the ‘Barriers’ option.

See also the comments on thread-safety and performance under OpenMP: on all known R platforms OpenMP is implemented via pthreads and the known performance issues are in the latter.


Previous: , Up: Using Makevars   [Contents][Index]

1.2.1.3 Compiling in sub-directories

Package authors fairly often want to organize code in sub-directories of src, for example if they are including a separate piece of external software to which this is an R interface.

One simple way is to set OBJECTS to be all the objects that need to be compiled, including those in sub-directories. For example, CRAN package RSiena has

SOURCES = $(wildcard data/*.cpp network/*.cpp utils/*.cpp model/*.cpp model/*/*.cpp model/*/*/*.cpp)

OBJECTS = siena07utilities.o siena07internals.o siena07setup.o siena07models.o $(SOURCES:.cpp=.o)

One problem with that approach is that unless GNU make extensions are used, the source files need to be listed and kept up-to-date, as in the following from CRAN package lossDev:

OBJECTS.samplers = samplers/ExpandableArray.o samplers/Knots.o \
  samplers/RJumpSpline.o samplers/RJumpSplineFactory.o \
  samplers/RealSlicerOV.o samplers/SliceFactoryOV.o samplers/MNorm.o
OBJECTS.distributions = distributions/DSpline.o \
  distributions/DChisqrOV.o distributions/DTOV.o \
  distributions/DNormOV.o distributions/DUnifOV.o distributions/RScalarDist.o
OBJECTS.root = RJump.o

OBJECTS = $(OBJECTS.samplers) $(OBJECTS.distributions) $(OBJECTS.root)

Where the subdirectory is self-contained code with a suitable makefile, the best approach is something like

PKG_LIBS = -LCsdp/lib -lsdp $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)

$(SHLIB): Csdp/lib/libsdp.a

Csdp/lib/libsdp.a:      
        @(cd Csdp/lib && $(MAKE) libsdp.a \
          CC="$(CC)" CFLAGS="$(CFLAGS) $(CPICFLAGS)" AR="$(AR)" RANLIB="$(RANLIB)")

Note the quotes: the macros can contain spaces, e.g. CC = "gcc -m64 -std=gnu99". Several authors have forgotten about parallel makes: the static library in the subdirectory must be made before the shared object ($(SHLIB)) and so the latter must depend on the former. Others forget the need37 for position-independent code.

We really do not recommend using src/Makefile instead of src/Makevars, and as the example above shows, it is not necessary.


Next: , Previous: , Up: Configure and cleanup   [Contents][Index]

1.2.2 Configure example

It may be helpful to give an extended example of using a configure script to create a src/Makevars file: this is based on that in the RODBC package.

The configure.ac file follows: configure is created from this by running autoconf in the top-level package directory (containing configure.ac).

AC_INIT([RODBC], 1.1.8) dnl package name, version

dnl A user-specifiable option
odbc_mgr=""
AC_ARG_WITH([odbc-manager],
            AC_HELP_STRING([--with-odbc-manager=MGR],
                           [specify the ODBC manager, e.g. odbc or iodbc]),
            [odbc_mgr=$withval])

if test "$odbc_mgr" = "odbc" ; then
  AC_PATH_PROGS(ODBC_CONFIG, odbc_config)
fi

dnl Select an optional include path, from a configure option
dnl or from an environment variable.
AC_ARG_WITH([odbc-include],
            AC_HELP_STRING([--with-odbc-include=INCLUDE_PATH],
                           [the location of ODBC header files]),
            [odbc_include_path=$withval])
RODBC_CPPFLAGS="-I."
if test [ -n "$odbc_include_path" ] ; then
   RODBC_CPPFLAGS="-I. -I${odbc_include_path}"
else
  if test [ -n "${ODBC_INCLUDE}" ] ; then
     RODBC_CPPFLAGS="-I. -I${ODBC_INCLUDE}"
  fi
fi

dnl ditto for a library path
AC_ARG_WITH([odbc-lib],
            AC_HELP_STRING([--with-odbc-lib=LIB_PATH],
                           [the location of ODBC libraries]),
            [odbc_lib_path=$withval])
if test [ -n "$odbc_lib_path" ] ; then
   LIBS="-L$odbc_lib_path ${LIBS}"
else
  if test [ -n "${ODBC_LIBS}" ] ; then
     LIBS="-L${ODBC_LIBS} ${LIBS}"
  else
    if test -n "${ODBC_CONFIG}"; then
      odbc_lib_path=`odbc_config --libs | sed s/-lodbc//`
      LIBS="${odbc_lib_path} ${LIBS}"
    fi
  fi
fi

dnl Now find the compiler and compiler flags to use
: ${R_HOME=`R RHOME`}
if test -z "${R_HOME}"; then
  echo "could not determine R_HOME"
  exit 1
fi
CC=`"${R_HOME}/bin/R" CMD config CC`
CPP=`"${R_HOME}/bin/R" CMD config CPP`
CFLAGS=`"${R_HOME}/bin/R" CMD config CFLAGS`
CPPFLAGS=`"${R_HOME}/bin/R" CMD config CPPFLAGS`
AC_PROG_CC
AC_PROG_CPP


if test -n "${ODBC_CONFIG}"; then
  RODBC_CPPFLAGS=`odbc_config --cflags`
fi
CPPFLAGS="${CPPFLAGS} ${RODBC_CPPFLAGS}"

dnl Check the headers can be found
AC_CHECK_HEADERS(sql.h sqlext.h)
if test "${ac_cv_header_sql_h}" = no ||
   test "${ac_cv_header_sqlext_h}" = no; then
   AC_MSG_ERROR("ODBC headers sql.h and sqlext.h not found")
fi

dnl search for a library containing an ODBC function
if test [ -n "${odbc_mgr}" ] ; then
  AC_SEARCH_LIBS(SQLTables, ${odbc_mgr}, ,
      AC_MSG_ERROR("ODBC driver manager ${odbc_mgr} not found"))
else
  AC_SEARCH_LIBS(SQLTables, odbc odbc32 iodbc, ,
      AC_MSG_ERROR("no ODBC driver manager found"))
fi

dnl for 64-bit ODBC need SQL[U]LEN, and it is unclear where they are defined.
AC_CHECK_TYPES([SQLLEN, SQLULEN], , , [# include <sql.h>])
dnl for unixODBC header
AC_CHECK_SIZEOF(long, 4)

dnl substitute RODBC_CPPFLAGS and LIBS
AC_SUBST(RODBC_CPPFLAGS)
AC_SUBST(LIBS)
AC_CONFIG_HEADERS([src/config.h])
dnl and do substitution in the src/Makevars.in and src/config.h
AC_CONFIG_FILES([src/Makevars])
AC_OUTPUT

where src/Makevars.in would be simply

PKG_CPPFLAGS = @RODBC_CPPFLAGS@
PKG_LIBS = @LIBS@

A user can then be advised to specify the location of the ODBC driver manager files by options like (lines broken for easier reading)

R CMD INSTALL \
  --configure-args='--with-odbc-include=/opt/local/include \
  --with-odbc-lib=/opt/local/lib --with-odbc-manager=iodbc' \
  RODBC

or by setting the environment variables ODBC_INCLUDE and ODBC_LIBS.


Next: , Previous: , Up: Configure and cleanup   [Contents][Index]

1.2.3 Using F95 code

R assumes that source files with extension .f are FORTRAN 77, and passes them to the compiler specified by ‘F77’. On most but not all platforms that compiler will accept Fortran 90/95 code: some platforms have a separate Fortran 90/95 compiler and a few (by now quite rare38) platforms have no Fortran 90/95 support.

This means that portable packages need to be written in correct FORTRAN 77, which will also be valid Fortran 95. See https://developer.R-project.org/Portability.html for reference resources. In particular, free source form F95 code is not portable.

On some systems an alternative F95 compiler is available: from the gcc family this might be gfortran or g95. Configuring R will try to find a compiler which (from its name) appears to be a Fortran 90/95 compiler, and set it in macro ‘FC’. Note that it does not check that such a compiler is fully (or even partially) compliant with Fortran 90/95. Packages making use of Fortran 90/95 features should use file extension .f90 or .f95 for the source files: the variable PKG_FCFLAGS specifies any special flags to be used. There is no guarantee that compiled Fortran 90/95 code can be mixed with any other type of compiled code, nor that a build of R will have support for such packages.

Some (but not all) compilers specified by the ‘FC’ macro will accept Fortran 2003 or 2008 code: such code should still use file extension .f90 or .f95. For platforms using gfortran, you may need to include -std=f2003 or -std=f2008 in PKG_FCFLAGS: the default is ‘GNU Fortran’, Fortran 95 with non-standard extensions. The Oracle f95 compiler ‘accepts some Fortran 2003/8 features’ (search for ‘Oracle Developer Studio 12.5: Fortran User’s Guide’ and look for §4.6).

Modern versions of Fortran support modules, whereby compiling one source file creates a module file which is then included in others. (Module files typically have a .mod extension: they do depend on the compiler used and so should never be included in a package.) This creates a dependence which make will not know about and often causes installation with a parallel make to fail. Thus it is necessary to add explicit dependencies to src/Makevars to tell make the constraints on the order of compilation. For example, if file iface.f90 creates a module ‘iface’ used by files cmi.f90 and dmi.f90 then src/Makevars needs to contain something like

cmi.o dmi.o: iface.o

Next: , Previous: , Up: Configure and cleanup   [Contents][Index]

1.2.4 Using C++11 code

R can be built without a C++ compiler although one is available (but not necessarily installed) on all known R platforms. For full portability across platforms, all that can be assumed is approximate support for the C++98 standard (the widely used g++ deviates considerably from the standard). Some compilers have a concept of ‘C++03’ (‘essentially a bug fix’) or ‘C++ Technical Report 1’ (TR1), an optional addition to the ‘C++03’ revision which was published in 2007. A revised standard was published in 2011 and compilers with pretty much complete implementations are available. C++11 added all of the C99 features which are not otherwise implemented in C++, and C++ compilers commonly accept C99 extensions to C++98. A minor update39 to C++11 (C++14) was published in December 2014. The next standard (C++17) was approved in Sept 2017 and will be published by ISO in due course.

What standard a C++ compiler aims to support can be hard to determine: the value40 of __cplusplus may help but some compilers use it to denote a standard which is partially supported and some the latest standard which is (almost) fully supported. As from version 6, g++ defaults to C++14 (with GNU extensions): earlier versions aim to support C++03 with many extensions (including support for TR1). clang with its native41 libc++ headers and library includes most C++14 features, and does not support TR1. As from version 6.0.0, clang is expected to default to C++14.

Since version 3.1.0, R has provided support for C++11 in packages in addition to C++98. This support is not uniform across platforms as it depends on the capabilities of the compiler (see below). When R is configured, it will determine whether the C++ compiler supports C++11 and which compiler flags, if any, are required to enable C++11 support. For example, recent versions of g++ or clang++ accept the compiler flag -std=c++11, and earlier versions support a flag -std=c++0x, but the latter only provided partial support for the C++11 standard (it later became a deprecated synonym for -std=c++11).

In order to use C++11 code in a package, the package’s Makevars file (or Makevars.win on Windows) should include the line

CXX_STD = CXX11

Compilation and linking will then be done with the C++11 compiler.

Packages without a src/Makevars or src/Makefile file may specify that they require C++11 for code in the src directory by including ‘C++11’ in the ‘SystemRequirements’ field of the DESCRIPTION file, e.g.

SystemRequirements: C++11

If a package does have a src/Makevars[.win] file then setting the make variable ‘CXX_STD’ is preferred, as it allows R CMD SHLIB to work correctly in the package’s src directory.

Conversely, to ensure that the C++98 standard is assumed even when this is not the compiler default, use

SystemRequirements: C++98

or

CXX_STD = CXX98

The C++11 compiler will be used systematically by R for all C++ code if the environment variable USE_CXX11 is defined (with any value). Hence this environment variable should be defined when invoking R CMD SHLIB in the absence of a Makevars file (or Makevars.win on Windows) if a C++11 compiler is required.

Further control over compilation of C++11 code can be obtained by specifying the macros ‘CXX11’ and ‘CXX11STD’ when R is configured42, or in a personal or site Makevars file. See Customizing package compilation in R Installation and Administration. If C++11 support is not available then these macros are both empty; if it is available by default, ‘CXX11’ defaults to ‘CXX’ and ‘CXX11STD’ is empty. Otherwise, ‘CXX11’ defaults to the same value as the C++ compiler ‘CXX’ and the flag ‘CXX11STD’ defaults to -std=c++11 or similar. It is possible to specify ‘CXX11’ to be a distinct compiler just for C++11–using packages, e.g. g++ on Solaris. Note however that different C++ compilers (and even different versions of the same compiler) often differ in their ABI so their outputs can rarely be mixed. By setting ‘CXX11STD’ it is also possible to choose a different dialect of the standard such as -std=gnu++11.

As noted above, support for C++11 varies across platforms: on some platforms, it may be possible or necessary to select a different compiler for C++11, via personal or site Makevars files.
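
As a sketch only (the compiler name g++-7 is an assumption for illustration; the macros are documented in R Installation and Administration), a personal ~/.R/Makevars selecting a distinct compiler and dialect for C++11 packages might contain

# use a separately installed g++ (assumed here to be g++-7) for C++11 packages
CXX11 = g++-7
CXX11STD = -std=c++11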

There is no guarantee that C++11 can be used in a package in combination with any other compiled language (even C), as the C++11 compiler may be incompatible with the native compilers for the platform. (There are known problems mixing C++11 with Fortran.)

If a package using C++11 has a configure script it is essential that it selects the correct compiler, via something like

CXX11=`"${R_HOME}/bin/R" CMD config CXX11`
CXX11STD=`"${R_HOME}/bin/R" CMD config CXX11STD`
CXX="${CXX11} ${CXX11STD}"
CXXFLAGS=`"${R_HOME}/bin/R" CMD config CXX11FLAGS`
AC_LANG(C++)

(paying attention to all the quotes required).

If you want to compile C++11 code in a subdirectory, make sure you pass down the macros to specify that compiler, e.g. in src/Makevars

sublibs:
         @(cd libs && $(MAKE) \
            CXX="$(CXX11) $(CXX11STD)" CXXFLAGS="$(CXX11FLAGS) $(CXX11PICFLAGS)")

Note that the mechanisms described here specify C++11 for code compiled by R CMD SHLIB as used by default by R CMD INSTALL. They do not necessarily apply if there is a src/Makefile file, nor to compilation done in vignettes or via other packages.


Next: , Previous: , Up: Configure and cleanup   [Contents][Index]

1.2.5 Using C++14 code

Support for C++14 was added in R version 3.4.0. Similar considerations to C++11 apply, except that the variables associated with the C++14 compiler use the prefix ‘CXX14’ instead of ‘CXX11’. Hence to use C++14 code in a package, the package’s Makevars file (or Makevars.win on Windows) should include the line

CXX_STD = CXX14

In the absence of a Makevars file, C++14 support can also be requested by the line:

SystemRequirements: C++14

in the DESCRIPTION file. Finally, the C++14 compiler can be used systematically by setting the environment variable USE_CXX14.

Note that code written for C++11 that emulates features of C++14 will not necessarily compile under a C++14 compiler43, since the emulation typically leads to a namespace clash. In order to ensure that the code also compiles under C++14, something like the following should be done:

#if __cplusplus >= 201402L
using std::make_unique;
#else
// your emulation
#endif

Code needing C++14 features would do better to test for their presence via ‘SD-6 feature tests’44. That test could be

#include <memory> // header where this is defined
#if defined(__cpp_lib_make_unique) && (__cpp_lib_make_unique >= 201304)
using std::make_unique;
#else
// your emulation
#endif

The webpage http://en.cppreference.com/w/cpp/compiler_support gives some information on which compilers are known to support recent C++ features, including those in the C++17 drafts (for which feature tests should be used).


Previous: , Up: Configure and cleanup   [Contents][Index]

1.2.6 Using C++17 code

Experimental support for C++17 has been added to R version 3.4.0. The configure script tests a subset of C++17 features. At the time of writing (March 2017) both clang 4.0.0 and gcc 7.1 pass these tests (with flags -std=gnu++1z and -std=gnu++17 respectively chosen by the configure script). Note that the C++17 feature tests are incomplete and are subject to change in future R versions as compiler support for the standard improves.

The variables associated with the C++17 compiler use the prefix ‘CXX17’. Hence to use C++17 code in a package, the package’s Makevars file (or Makevars.win on Windows) should include the line

CXX_STD = CXX17

In the absence of a Makevars file, C++17 support can also be requested by the line:

SystemRequirements: C++17

in the DESCRIPTION file. Finally, the C++17 compiler can be used systematically by setting the environment variable USE_CXX17.


Next: , Previous: , Up: Creating R packages   [Contents][Index]

1.3 Checking and building packages

Before using these tools, please check that your package can be installed (which checks that it can be loaded). R CMD check will inter alia do this, but you may get more detailed error messages doing the install directly.

If your package specifies an encoding in its DESCRIPTION file, you should run these tools in a locale which makes use of that encoding: they may not work at all or may work incorrectly in other locales (although UTF-8 locales will most likely work).

Note: R CMD check and R CMD build run R processes with --vanilla in which none of the user’s startup files are read. If you need R_LIBS set (to find packages in a non-standard library) you can set it in the environment: also you can use the check and build environment files (as specified by the environment variables R_CHECK_ENVIRON and R_BUILD_ENVIRON; if unset, files45 ~/.R/check.Renviron and ~/.R/build.Renviron are used) to set environment variables when using these utilities.

Note to Windows users: R CMD build may make use of the Windows toolset (see the “R Installation and Administration” manual) if present and in your path, and it is required for packages which need it to install (including those with configure.win or cleanup.win scripts or a src directory) and e.g. need vignettes built.

You may need to set the environment variable TMPDIR to point to a suitable writable directory with a path not containing spaces – use forward slashes for the separators. Also, the directory needs to be on a case-honouring file system (some network-mounted file systems are not).


Next: , Previous: , Up: Checking and building packages   [Contents][Index]

1.3.1 Checking packages

Using R CMD check, the R package checker, one can test whether source R packages work correctly. It can be run on one or more directories, or compressed package tar archives with extension .tar.gz, .tgz, .tar.bz2 or .tar.xz.

It is strongly recommended that the final checks are run on a tar archive prepared by R CMD build.

This runs a series of checks, including

  1. The package is installed. This will warn about missing cross-references and duplicate aliases in help files.
  2. The file names are checked to be valid across file systems and supported operating system platforms.
  3. The files and directories are checked for sufficient permissions (Unix-alikes only).
  4. The files are checked for binary executables, using a suitable version of file if available46. (There may be rare false positives.)
  5. The DESCRIPTION file is checked for completeness, and some of its entries for correctness. Unless installation tests are skipped, checking is aborted if the package dependencies cannot be resolved at run time. (You may need to set R_LIBS in the environment if dependent packages are in a separate library tree.) One check is that the package name is not that of a standard package, nor one of the defunct standard packages (‘ctest’, ‘eda’, ‘lqs’, ‘mle’, ‘modreg’, ‘mva’, ‘nls’, ‘stepfun’ and ‘ts’). Another check is that all packages mentioned in calls to library or require, or from which the NAMESPACE file imports, or which are called via :: or :::, are listed (in ‘Depends’, ‘Imports’, ‘Suggests’): this is not an exhaustive check of the actual imports.
  6. Available index information (in particular, for demos and vignettes) is checked for completeness.
  7. The package subdirectories are checked for suitable file names and for not being empty. The checks on file names are controlled by the option --check-subdirs=value. This defaults to ‘default’, which runs the checks only if checking a tarball: the default can be overridden by specifying the value as ‘yes’ or ‘no’. Further, the check on the src directory is only run if the package does not contain a configure script (which corresponds to the value ‘yes-maybe’) and there is no src/Makefile or src/Makefile.in.

    To allow a configure script to generate suitable files, files ending in ‘.in’ will be allowed in the R directory.

    A warning is given for directory names that look like R package check directories – many packages have been submitted to CRAN containing these.

  8. The R files are checked for syntax errors. Bytes which are non-ASCII are reported as warnings, but these should be regarded as errors unless it is known that the package will always be used in the same locale.
  9. It is checked that the package can be loaded, first with the usual default packages and then only with package base already loaded. It is checked that the namespace can be loaded in an empty session with only the base namespace loaded. (Namespaces and packages can be loaded very early in the session, before the default packages are available, so packages should work then.)
  10. The R files are checked for correct calls to library.dynam. Package startup functions are checked for correct argument lists and (incorrect) calls to functions which modify the search path or inappropriately generate messages. The R code is checked for possible problems using codetools. In addition, it is checked whether S3 methods have all arguments of the corresponding generic, and whether the final argument of replacement functions is called ‘value’. All foreign function calls (.C, .Fortran, .Call and .External calls) are tested to see if they have a PACKAGE argument, and if not, whether the appropriate DLL might be deduced from the namespace of the package. Any other calls are reported. (The check is generous, and users may want to supplement this by examining the output of tools::checkFF("mypkg", verbose=TRUE), especially if the intention were to always use a PACKAGE argument)
  11. The Rd files are checked for correct syntax and metadata, including the presence of the mandatory fields (\name, \alias, \title and \description). The Rd name and title are checked for being non-empty, and there is a check for missing cross-references (links).
  12. A check is made for missing documentation entries, such as undocumented user-level objects in the package.
  13. Documentation for functions, data sets, and S4 classes is checked for consistency with the corresponding code.
  14. It is checked whether all function arguments given in \usage sections of Rd files are documented in the corresponding \arguments section.
  15. The data directory is checked for non-ASCII characters and for the use of reasonable levels of compression.
  16. C, C++ and FORTRAN source and header files47 are tested for portable (LF-only) line endings. If there is a Makefile or Makefile.in or Makevars or Makevars.in file under the src directory, it is checked for portable line endings and the correct use of ‘$(BLAS_LIBS)’ and ‘$(LAPACK_LIBS)’.

    Compiled code is checked for symbols corresponding to functions which might terminate R or write to stdout/stderr instead of the console. Note that the latter might give false positives in that the symbols might be pulled in with external libraries and could never be called. Windows48 users should note that the Fortran and C++ runtime libraries are examples of such external libraries.

  17. Some checks are made of the contents of the inst/doc directory. These always include checking for files that look like leftovers, and if suitable tools (such as qpdf) are available, checking that the PDF documentation is of minimal size.
  18. The examples provided by the package’s documentation are run. (see Writing R documentation files, for information on using \examples to create executable example code.) If there is a file tests/Examples/pkg-Ex.Rout.save, the output of running the examples is compared to that file.

    Of course, released packages should be able to run at least their own examples. Each example is run in a ‘clean’ environment (so earlier examples cannot be assumed to have been run), and with the variables T and F redefined to generate an error unless they are set in the example: See Logical vectors in An Introduction to R.

  19. If the package sources contain a tests directory then the tests specified in that directory are run. (Typically they will consist of a set of .R source files and target output files .Rout.save.) Please note that the comparison will be done in the end user’s locale, so the target output files should be ASCII if at all possible. (The command line option --test-dir=foo may be used to specify tests in a non-standard location. For example, unusually slow tests could be placed in inst/slowTests and then R CMD check --test-dir=inst/slowTests would be used to run them. Other names that have been suggested are, for example, inst/testWithOracle for tests that require Oracle to be installed, inst/randomTests for tests which use random values and may occasionally fail by chance, etc.)
  20. The code in package vignettes (see Writing package vignettes) is executed, and the vignette PDFs re-made from their sources as a check of completeness of the sources (unless there is a ‘BuildVignettes’ field in the package’s DESCRIPTION file with a false value). If there is a target output file .Rout.save in the vignette source directory, the output from running the code in that vignette is compared with the target output file and any differences are reported (but not recorded in the log file). (If the vignette sources are in the deprecated location inst/doc, do mark such target output files to not be installed in .Rinstignore.)

    If there is an error49 in executing the R code in vignette foo.ext, a log file foo.ext.log is created in the check directory. The vignette PDFs are re-made in a copy of the package sources in the vign_test subdirectory of the check directory, so for further information on errors look in directory pkgname/vign_test/vignettes. (It is only retained if there are errors or if environment variable _R_CHECK_CLEAN_VIGN_TEST_ is set to a false value.)

  21. The PDF version of the package’s manual is created (to check that the Rd files can be converted successfully). This needs LaTeX and suitable fonts and LaTeX packages to be installed. See Making the manuals in R Installation and Administration.

All these tests are run with collation set to the C locale, and for the examples and tests with environment variable LANGUAGE=en: this is to minimize differences between platforms.

Use R CMD check --help to obtain more information about the usage of the R package checker. A subset of the checking steps can be selected by adding command-line options. It also allows customization by setting environment variables _R_CHECK_*_ as described in Tools in R Internals: a set of these customizations similar to those used by CRAN can be selected by the option --as-cran (which works best if Internet access is available). Some Windows users may need to set environment variable R_WIN_NO_JUNCTIONS to a non-empty value. The test of cyclic declarations50 in DESCRIPTION files needs repositories (including CRAN) set: do this in ~/.Rprofile, by e.g.

options(repos = c(CRAN="https://cran.r-project.org"))

One check customization which can be revealing is

_R_CHECK_CODETOOLS_PROFILE_="suppressLocalUnused=FALSE"

which reports unused local assignments. Not only does this point out computations which are unnecessary because their results are unused, it can also uncover errors. (Two common ones are intending to update an object by assigning a value but mistyping its name, and assigning in the wrong scope, for example using <- where <<- was intended.) This can give false positives, most commonly because of non-standard evaluation for formulae and because the intention is to return objects in the environment of a function for later use.

Complete checking of a package which contains a file README.md needs pandoc installed: see http://johnmacfarlane.net/pandoc/installing.html. This should be reasonably current: at the time of writing CRAN used version 1.12.4.2 to process these files.

You do need to ensure that the package is checked in a suitable locale if it contains non-ASCII characters. Such packages are likely to fail some of the checks in a C locale, and R CMD check will warn if it spots the problem. You should be able to check any package in a UTF-8 locale (if one is available). Beware that although a C locale is rarely used at a console, it may be the default if logging in remotely or for batch jobs.

Multiple sub-architectures: On systems which support multiple sub-architectures (principally Windows), R CMD check will install and check a package which contains compiled code under all available sub-architectures. (Use option --force-multiarch to force this for packages without compiled code, which are otherwise only checked under the main sub-architecture.) This will run the loading tests, examples and tests directory under each installed sub-architecture in turn, and give an error if any fail. Where environment variables (including perhaps PATH) need to be set differently for each sub-architecture, these can be set in architecture-specific files such as R_HOME/etc/i386/Renviron.site.

An alternative approach is to use R CMD check --no-multiarch to check the primary sub-architecture, and then to use something like R --arch=x86_64 CMD check --extra-arch or (Windows) /path/to/R/bin/x64/Rcmd check --extra-arch to run for each additional sub-architecture just the checks51 which differ by sub-architecture. (This approach is required for packages which are installed by R CMD INSTALL --merge-multiarch.)

Where packages need additional commands to install all the sub-architectures these can be supplied by e.g. --install-args=--force-biarch.


Next: , Previous: , Up: Checking and building packages   [Contents][Index]

1.3.2 Building package tarballs

Packages may be distributed in source form as “tarballs” (.tar.gz files) or in binary form. The source form can be installed on all platforms with suitable tools and is the usual form for Unix-like systems; the binary form is platform-specific, and is the more common distribution form for the Windows and macOS platforms.

Using R CMD build, the R package builder, one can build R package tarballs from their sources (for example, for subsequent release).

Prior to actually building the package in the standard gzipped tar file format, a few diagnostic checks and cleanups are performed. In particular, it is tested whether object indices exist and can be assumed to be up-to-date, and C, C++ and FORTRAN source files and relevant makefiles in a src directory are tested and converted to LF line-endings if necessary.

Run-time checks whether the package works correctly should be performed using R CMD check prior to invoking the final build procedure.

To exclude files from being put into the package, one can specify a list of exclude patterns in file .Rbuildignore in the top-level source directory. These patterns should be Perl-like regular expressions (see the help for regexp in R for the precise details), one per line, to be matched case-insensitively against the file and directory names relative to the top-level package source directory. In addition, directories from source control systems52 or from eclipse53, directories with names ending .Rcheck or Old or old and files GNUMakefile54, Read-and-delete-me or with base names starting with ‘.#’, or starting and ending with ‘#’, or ending in ‘~’, ‘.bak’ or ‘.swp’, are excluded by default. In addition, those files in the R, demo and man directories which are flagged by R CMD check as having invalid names will be excluded.

Use R CMD build --help to obtain more information about the usage of the R package builder.

Unless R CMD build is invoked with the --no-build-vignettes option (or the package’s DESCRIPTION contains ‘BuildVignettes: no’ or similar), it will attempt to (re)build the vignettes (see Writing package vignettes) in the package. To do so it installs the current package into a temporary library tree, but any dependent packages need to be installed in an available library tree (see the Note: at the top of this section).

Similarly, if the .Rd documentation files contain any \Sexpr macros (see Dynamic pages), the package will be temporarily installed to execute them. Post-execution binary copies of those pages containing build-time macros will be saved in build/partial.rdb. If there are any install-time or render-time macros, a .pdf version of the package manual will be built and installed in the build subdirectory. (This allows CRAN or other repositories to display the manual even if they are unable to install the package.) This can be suppressed by the option --no-manual or if package’s DESCRIPTION contains ‘BuildManual: no’ or similar.

One of the checks that R CMD build runs is for empty source directories. These are in most (but not all) cases unintentional; if they are intentional, use the option --keep-empty-dirs (or set the environment variable _R_BUILD_KEEP_EMPTY_DIRS_ to ‘TRUE’, or have a ‘BuildKeepEmpty’ field with a true value in the DESCRIPTION file).

The --resave-data option allows saved images (.rda and .RData files) in the data directory to be optimized for size. It will also compress tabular files and convert .R files to saved images. It can take values no, gzip (the default if this option is not supplied, which can be changed by setting the environment variable _R_BUILD_RESAVE_DATA_) and best (equivalent to giving it without a value), which chooses the most effective compression. Using best adds a dependence on R (>= 2.10) to the DESCRIPTION file if bzip2 or xz compression is selected for any of the files. If this is thought undesirable, --resave-data=gzip (which is the default if that option is not supplied) will do what compression it can with gzip. A package can control how its data is resaved by supplying a ‘BuildResaveData’ field (with one of the values given earlier in this paragraph) in its DESCRIPTION file.

The --compact-vignettes option will run tools::compactPDF over the PDF files in inst/doc (and its subdirectories) to losslessly compress them. This is not enabled by default (it can be selected by environment variable _R_BUILD_COMPACT_VIGNETTES_) and needs qpdf (http://qpdf.sourceforge.net/) to be available.

It can be useful to run R CMD check --check-subdirs=yes on the built tarball as a final check on the contents.

Where a non-POSIX file system is in use which does not utilize execute permissions, some care is needed with permissions. This applies on Windows and to e.g. FAT-formatted drives and SMB-mounted file systems on other OSes. The ‘mode’ of the file recorded in the tarball will be whatever file.info() returns. On Windows this will record only directories as having execute permission and on other OSes it is likely that all files have reported ‘mode’ 0777. A particular issue is packages being built on Windows which are intended to contain executable scripts such as configure and cleanup: R CMD build ensures those two are recorded with execute permission.

Directory build of the package sources is reserved for use by R CMD build: it contains information which may not easily be created when the package is installed, including index information on the vignettes and, rarely, information on the help pages and perhaps a copy of the PDF reference manual (see above).


Previous: , Up: Checking and building packages   [Contents][Index]

1.3.3 Building binary packages

Binary packages are compressed copies of installed versions of packages. They contain compiled shared libraries rather than C, C++ or Fortran source code, and the R functions are included in their installed form. The format and filename are platform-specific; for example, a binary package for Windows is usually supplied as a .zip file, and for the macOS platform the default binary package file extension is .tgz.

The recommended method of building binary packages is to use

R CMD INSTALL --build pkg

where pkg is either the name of a source tarball (in the usual .tar.gz format) or the location of the directory of the package source to be built. This operates by first installing the package and then packing the installed binaries into the appropriate binary package file for the particular platform.

By default, R CMD INSTALL --build will attempt to install the package into the default library tree for the local installation of R. This has two implications: the default library tree must be writable, and a successful installation will overwrite any existing installation of the same package in that tree.

To prevent changes to the present working installation or to provide an install location with write access, create a suitably located directory with write access and use the -l option to build the package in the chosen location. The usage is then

R CMD INSTALL -l location --build pkg

where location is the chosen directory with write access. The package will be installed as a subdirectory of location, and the package binary will be created in the current directory.

Other options for R CMD INSTALL can be found using R CMD INSTALL --help, and platform-specific details for special cases are discussed in the platform-specific FAQs.

Finally, at least one web-based service is available for building binary packages from (checked) source code: WinBuilder (see https://win-builder.R-project.org/) is able to build Windows binaries. Note that this is intended for developers on other platforms who do not have access to Windows but wish to provide binaries for the Windows platform.


Next: , Previous: , Up: Creating R packages   [Contents][Index]

1.4 Writing package vignettes

In addition to the help files in Rd format, R packages allow the inclusion of documents in arbitrary other formats. The standard location for these is subdirectory inst/doc of a source package, the contents will be copied to subdirectory doc when the package is installed. Pointers from package help indices to the installed documents are automatically created. Documents in inst/doc can be in arbitrary format, however we strongly recommend providing them in PDF format, so users on almost all platforms can easily read them. To ensure that they can be accessed from a browser (as an HTML index is provided), the file names should start with an ASCII letter and be comprised entirely of ASCII letters or digits or hyphen or underscore.

A special case is package vignettes. Vignettes are documents in PDF or HTML format obtained from plain text literate source files from which R knows how to extract R code and create output (in PDF/HTML or intermediate LaTeX). Vignette engines do this work, using “tangle” and “weave” functions respectively. Sweave, provided by the R distribution, is the default engine. Since R version 3.0.0, other vignette engines besides Sweave are supported; see Non-Sweave vignettes.

Package vignettes have their sources in subdirectory vignettes of the package sources. Note that the location of the vignette sources only affects R CMD build and R CMD check: the tarball built by R CMD build includes in inst/doc the components intended to be installed.

Sweave vignette sources are normally given the file extension .Rnw or .Rtex, but for historical reasons extensions55 .Snw and .Stex are also recognized. Sweave allows the integration of R code chunks into LaTeX documents: see the Sweave help page in R and the Sweave vignette in package utils for details on the source document format.

Package vignettes are tested by R CMD check by executing all R code chunks they contain (except those marked for non-evaluation, e.g., with option eval=FALSE for Sweave). The R working directory for all vignette tests in R CMD check is a copy of the vignette source directory. Make sure all files needed to run the R code in the vignette (data sets, …) are accessible by either placing them in the inst/doc hierarchy of the source package or by using calls to system.file(). All other files needed to re-make the vignettes (such as LaTeX style files, BibTeX input files and files for any figures not created by running the code in the vignette) must be in the vignette source directory. R CMD check will check that vignette production has succeeded by comparing modification times of output files in inst/doc with the source in vignettes.
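
For instance, a hedged sketch of using system.file() from vignette code to locate a data file shipped with the installed package (the file extdata/mydata.csv and the package name mypkg are hypothetical):

## Locate a file installed with the package, wherever the library lives;
## mustWork = TRUE gives an error rather than "" if the file is missing.
path <- system.file("extdata", "mydata.csv", package = "mypkg",
                    mustWork = TRUE)
dat <- read.csv(path)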

R CMD build will automatically56 create the (PDF or HTML versions of the) vignettes in inst/doc for distribution with the package sources. By including the vignette outputs in the package sources it is not necessary that these can be re-built at install time, i.e., the package author can use private R packages, screen snapshots and LaTeX extensions which are only available on their own machine.57

By default R CMD build will run Sweave on all Sweave vignette source files in vignettes. If a Makefile is found in the vignette source directory, then R CMD build will try to run make after the Sweave runs; otherwise texi2pdf is run on each .tex file produced.

The first target in the Makefile should take care of both creation of PDF/HTML files and cleaning up afterwards (including after Sweave), i.e., delete all files that shall not appear in the final package archive. Note that if the make step runs R it needs to be careful to respect the environment values of R_LIBS and R_HOME58. Finally, if there is a Makefile and it has a ‘clean:’ target, make clean is run.

All the usual caveats about including a Makefile apply. It must be portable (no GNU extensions), use LF line endings and must work correctly with a parallel make: too many authors have written things like

## BAD EXAMPLE
all: pdf clean

pdf: ABC-intro.pdf ABC-details.pdf

%.pdf:  %.tex
        texi2dvi --pdf $*

clean:
        rm *.tex ABC-details-*.pdf

which will start removing the source files whilst pdflatex is working.

Metadata lines can be placed in the source file, preferably in LaTeX comments in the preamble. One such is a \VignetteIndexEntry of the form

%\VignetteIndexEntry{Using Animal}

Others you may see are \VignettePackage (currently ignored), \VignetteDepends and \VignetteKeyword (which replaced \VignetteKeywords). These are processed at package installation time to create the saved data frame Meta/vignette.rds, but only the \VignetteIndexEntry and \VignetteKeyword statements are currently used. The \VignetteEngine statement is described in Non-Sweave vignettes.

At install time an HTML index for all vignettes in the package is automatically created from the \VignetteIndexEntry statements unless a file index.html exists in directory inst/doc. This index is linked from the HTML help index for the package. If you do supply an inst/doc/index.html file it should contain relative links only to files under the installed doc directory, or perhaps (not really an index) to HTML help files or to the DESCRIPTION file, and be valid HTML as confirmed via the W3C Markup Validation Service or Validator.nu.

Sweave/Stangle allows the document to specify the split=TRUE option to create a single R file for each code chunk: this will not work for vignettes where it is assumed that each vignette source generates a single file with the vignette extension replaced by .R.

Do watch that PDFs are not too large – one in a CRAN package was 72MB! This is usually caused by the inclusion of overly detailed figures, which will not render well in PDF viewers. Sometimes it is much better to generate fairly high resolution bitmap (PNG, JPEG) figures and include those in the PDF document.
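
As an illustrative sketch (not tied to any particular package), a dense scatterplot can be rendered once to a moderate-resolution PNG and that file included in the document, instead of storing every point as a PDF vector object:

## Render 100,000 points to a bitmap; res gives the nominal resolution in ppi.
png("scatter.png", width = 1600, height = 1200, res = 200)
plot(rnorm(1e5), rnorm(1e5), pch = ".")
dev.off()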

When R CMD build builds the vignettes, it copies these and the vignette sources from directory vignettes to inst/doc. To install any other files from the vignettes directory, include a file vignettes/.install_extras which specifies these as Perl-like regular expressions on one or more lines. (See the description of the .Rinstignore file for full details.)


Next: , Previous: , Up: Writing package vignettes   [Contents][Index]

1.4.1 Encodings and vignettes

Vignettes will in general include descriptive text, R input, R output and figures, LaTeX include files and bibliographic references. As any of these may contain non-ASCII characters, the handling of encodings can become very complicated.

The vignette source file should be written in ASCII or contain a declaration of the encoding (see below). This applies even to comments within the source file, since vignette engines process comments to look for options and metadata lines. When an engine’s weave and tangle functions are called on the vignette source, it will be converted to the encoding of the current R session.

Stangle() will produce an R code file in the current locale’s encoding: for a non-ASCII vignette, the encoding used is recorded in a comment at the top of the file.

Sweave() will produce a .tex file in the current encoding, or in UTF-8 if that is declared. Non-ASCII encodings need to be declared to LaTeX via a line like

\usepackage[utf8]{inputenc}

(It is also possible to use the more recent ‘inputenx’ LaTeX package.) For files where this line is not needed (e.g. chapters included within the body of a larger document, or non-Sweave vignettes), the encoding may be declared using a comment like

%\VignetteEncoding{UTF-8}

If the encoding is UTF-8, this can also be declared using the declaration

%\SweaveUTF8

If no declaration is given in the vignette, it will be assumed to be in the encoding declared for the package. If there is no encoding declared in either place, then it is an error to use non-ASCII characters in the vignette.

In any case, be aware that LaTeX may require the ‘usepackage’ declaration.

Sweave() will also parse and evaluate the R code in each chunk. The R output will also be in the current locale (or UTF-8 if so declared), and should be covered by the ‘inputenc’ declaration. One thing people often forget is that the R output may not be ASCII even for ASCII R sources, for many possible reasons. A common one is the use of ‘fancy’ quotes: see the R help on sQuote. Note carefully that it is not portable to declare UTF-8 or CP1252 to cover such quotes, as their encoding will depend on the locale used to run Sweave(); this can be circumvented by setting options(useFancyQuotes="UTF-8") in the vignette.
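
For example, a minimal sketch of a setup chunk near the top of the vignette:

## Force directional quotes to be UTF-8 regardless of the locale in which
## Sweave() is run, so the declared input encoding covers them.
options(useFancyQuotes = "UTF-8")
sQuote("result")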

The final issue is the encoding of figures – this applies only to PDF figures and not PNG etc. The PDF figures will contain declarations for their encoding, but the Sweave option pdf.encoding may need to be set appropriately: see the help for the pdf() graphics device.

As a real example of the complexities, consider the fortunes package version ‘1.4-0’. That package did not have a declared encoding, and its vignette was in ASCII. However, the data it displays are read from a UTF-8 CSV file and will be assumed to be in the current encoding, so fortunes.tex will be in UTF-8 in any locale. Had read.table been told the data were UTF-8, fortunes.tex would have been in the locale’s encoding.


Previous: , Up: Writing package vignettes   [Contents][Index]

1.4.2 Non-Sweave vignettes

Vignettes in formats other than Sweave are supported via “vignette engines”. For example knitr version 1.1 or later can create .tex files from a variation on Sweave format, and .html files from a variation on “markdown” format. These engines replace the Sweave() function with other functions to convert vignette source files into LaTeX files for processing into .pdf, or directly into .pdf or .html files. The Stangle() function is replaced with a function that extracts the R source from a vignette.

R recognizes non-Sweave vignettes using filename extensions specified by the engine. For example, the knitr package supports the extension .Rmd (standing for “R markdown”). The user indicates the vignette engine within the vignette source using a \VignetteEngine line, for example

%\VignetteEngine{knitr::knitr}

This specifies the name of a package and an engine to use in place of Sweave in processing the vignette. As Sweave is the only engine supplied with the R distribution, the package providing any other engine must be specified in the ‘VignetteBuilder’ field of the package DESCRIPTION file, and also specified in the ‘Suggests’, ‘Imports’ or ‘Depends’ field (since its namespace must be available to build or check your package). If more than one package is specified as a builder, they will be searched in the order given there. The utils package is always implicitly appended to the list of builder packages, but may be included earlier to change the search order.

Note that a package with non-Sweave vignettes should always have a ‘VignetteBuilder’ field in the DESCRIPTION file, since this is how R CMD check recognizes that there are vignettes to be checked: packages listed there are required when the package is checked.

The vignette engine can produce .tex, .pdf, or .html files as output. If it produces .tex files, R will call texi2pdf to convert them to .pdf for display to the user (unless there is a Makefile in the vignettes directory).

Package writers who would like to supply vignette engines need to register those engines in the package .onLoad function. For example, that function could make the call

tools::vignetteEngine("knitr", weave = vweave, tangle = vtangle,
                      pattern = "[.]Rmd$", package = "knitr")

(The actual registration in knitr is more complicated, because it supports other input formats.) See the ?tools::vignetteEngine help topic for details on engine registration.
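
In full, a minimal sketch of such an .onLoad function (assuming weave and tangle functions vweave() and vtangle() defined elsewhere in the package) might be:

.onLoad <- function(libname, pkgname) {
  ## Register the engine when the namespace is loaded so that R CMD build
  ## and R CMD check can find it via the \VignetteEngine line.
  tools::vignetteEngine("knitr", weave = vweave, tangle = vtangle,
                        pattern = "[.]Rmd$", package = "knitr")
}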


Next: , Previous: , Up: Creating R packages   [Contents][Index]

1.5 Package namespaces

R has a namespace management system for code in packages. This system allows the package writer to specify which variables in the package should be exported to make them available to package users, and which variables should be imported from other packages.

The namespace for a package is specified by the NAMESPACE file in the top level package directory. This file contains namespace directives describing the imports and exports of the namespace. Additional directives register any shared objects to be loaded and any S3-style methods that are provided. Note that although the file looks like R code (and often has R-style comments) it is not processed as R code. Only very simple conditional processing of if statements is implemented.

Packages are loaded and attached to the search path by calling library or require. Only the exported variables are placed in the attached frame. Loading a package that imports variables from other packages will cause these other packages to be loaded as well (unless they have already been loaded), but they will not be placed on the search path by these implicit loads. Thus code in the package can only depend on objects in its own namespace and its imports (including the base namespace) being visible59.

Namespaces are sealed once they are loaded. Sealing means that imports and exports cannot be changed and that internal variable bindings cannot be changed. Sealing allows a simpler implementation strategy for the namespace mechanism. Sealing also allows code analysis and compilation tools to accurately identify the definition corresponding to a global variable reference in a function body.

The namespace controls the search strategy for variables used by functions in the package. If not found locally, R searches the package namespace first, then the imports, then the base namespace and then the normal search path.


Next: , Previous: , Up: Package namespaces   [Contents][Index]

1.5.1 Specifying imports and exports

Exports are specified using the export directive in the NAMESPACE file. A directive of the form

export(f, g)

specifies that the variables f and g are to be exported. (Note that variable names may be quoted, and reserved words and non-standard names such as [<-.fractions must be.)

For packages with many variables to export it may be more convenient to specify the names to export with a regular expression using exportPattern. The directive

exportPattern("^[^\\.]")

exports all variables that do not start with a period. However, such broad patterns are not recommended for production code: it is better to list all exports or use narrowly-defined groups. (This pattern applies to S4 classes.) Beware of patterns which include names starting with a period: some of these are internal-only variables and should never be exported, e.g. ‘.__S3MethodsTable__.’ (and the code nowadays excludes known cases).

Packages implicitly import the base namespace. Variables exported from other packages with namespaces need to be imported explicitly using the directives import and importFrom. The import directive imports all exported variables from the specified package(s). Thus the directive

import(foo, bar)

specifies that all exported variables in the packages foo and bar are to be imported. If only some of the exported variables from a package are needed, then they can be imported using importFrom. The directive

importFrom(foo, f, g)

specifies that the exported variables f and g of the package foo are to be imported. Using importFrom selectively rather than import is good practice and recommended notably when importing from packages with more than a dozen exports.

To import every symbol from a package but for a few exceptions, pass the except argument to import. The directive

import(foo, except=c(bar, baz))

imports every symbol from foo except bar and baz. The value of except should evaluate to something coercible to a character vector, after substituting each symbol for its corresponding string.

It is possible to export variables from a namespace which it has imported from other namespaces: this has to be done explicitly and not via exportPattern.

If a package only needs a few objects from another package it can use a fully qualified variable reference in the code instead of a formal import. A fully qualified reference to the function f in package foo is of the form foo::f. This is slightly less efficient than a formal import and also loses the advantage of recording all dependencies in the NAMESPACE file (but they still need to be recorded in the DESCRIPTION file). Evaluating foo::f will cause package foo to be loaded, but not attached, if it was not loaded already—this can be an advantage in delaying the loading of a rarely used package.

Using foo:::f instead of foo::f allows access to unexported objects. This is generally not recommended, as the semantics of unexported objects may be changed by the package author in routine maintenance.


Next: , Previous: , Up: Package namespaces   [Contents][Index]

1.5.2 Registering S3 methods

The standard method for S3-style UseMethod dispatching might fail to locate methods defined in a package that is imported but not attached to the search path. To ensure that these methods are available the packages defining the methods should ensure that the generics are imported and register the methods using S3method directives. If a package defines a function print.foo intended to be used as a print method for class foo, then the directive

S3method(print, foo)

ensures that the method is registered and available for UseMethod dispatch, and the function print.foo does not need to be exported. Since the generic print is defined in base it does not need to be imported explicitly.

(Note that function and class names may be quoted, and reserved words and non-standard names such as [<- and function must be.)

It is possible to specify a third argument to S3method, the function to be used as the method, for example

S3method(print, check_so_symbols, .print.via.format)

when print.check_so_symbols is not needed.


Next: , Previous: , Up: Package namespaces   [Contents][Index]

1.5.3 Load hooks

There are a number of hooks called as packages are loaded, attached, detached, and unloaded. See help(".onLoad") for more details.

Since loading and attaching are distinct operations, separate hooks are provided for each. These hook functions are called .onLoad and .onAttach. They both take arguments60 libname and pkgname; they should be defined in the namespace but not exported.

Packages can use a .onDetach or .Last.lib function (provided the latter is exported from the namespace) when detach is called on the package. It is called with a single argument, the full path to the installed package. There is also a hook .onUnload which is called when the namespace is unloaded (via a call to unloadNamespace, perhaps called by detach(unload = TRUE)) with argument the full path to the installed package’s directory. .onUnload and .onDetach should be defined in the namespace and not exported, but .Last.lib does need to be exported.

Packages are not likely to need .onAttach (except perhaps for a start-up banner); code to set options and load shared objects should be placed in a .onLoad function, or use made of the useDynLib directive described next.
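
As a hedged sketch of the distinction (the package name mypkg and its option are hypothetical):

.onLoad <- function(libname, pkgname) {
  ## Set a package option only if the user has not already set it.
  if (is.null(getOption("mypkg.verbose")))
    options(mypkg.verbose = FALSE)
  invisible()
}

.onAttach <- function(libname, pkgname) {
  ## Start-up banners should use packageStartupMessage() so that users can
  ## silence them with suppressPackageStartupMessages().
  packageStartupMessage("mypkg: see ?mypkg to get started")
}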

User-level hooks are also available: see the help on function setHook.

These hooks are often used incorrectly. People forget to export .Last.lib. Compiled code should be loaded in .onLoad (or via a useDynLib directive: see below) and unloaded in .onUnload. Do remember that a package’s namespace can be loaded without the namespace being attached (e.g. by pkgname::fun) and that a package can be detached and re-attached whilst its namespace remains loaded.


Next: , Previous: , Up: Package namespaces   [Contents][Index]

1.5.4 useDynLib

A NAMESPACE file can contain one or more useDynLib directives which allow shared objects that need to be loaded to be specified.61 The directive

useDynLib(foo)

registers the shared object foo62 for loading with library.dynam. Loading of registered object(s) occurs after the package code has been loaded and before running the load hook function. Packages that would only need a load hook function to load a shared object can use the useDynLib directive instead.

The useDynLib directive also accepts the names of the native routines that are to be used in R via the .C, .Call, .Fortran and .External interface functions. These are given as additional arguments to the directive, for example,

useDynLib(foo, myRoutine, myOtherRoutine)

By specifying these names in the useDynLib directive, the native symbols are resolved when the package is loaded and R variables identifying these symbols are added to the package’s namespace with these names. These can be used in the .C, .Call, .Fortran and .External calls in place of the name of the routine and the PACKAGE argument. For instance, we can call the routine myRoutine from R with the code

 .Call(myRoutine, x, y)

rather than

 .Call("myRoutine", x, y, PACKAGE = "foo")

There are at least two benefits to this approach. Firstly, the symbol lookup is done just once for each symbol rather than each time the routine is invoked. Secondly, this removes any ambiguity in resolving symbols that might be present in several compiled DLLs. However, this approach is nowadays deprecated in favour of supplying registration information (see below).

In some circumstances, there will already be an R variable in the package with the same name as a native symbol. For example, we may have an R function in the package named myRoutine. In this case, it is necessary to map the native symbol to a different R variable name. This can be done in the useDynLib directive by using named arguments. For instance, to map the native symbol name myRoutine to the R variable myRoutine_sym, we would use

useDynLib(foo, myRoutine_sym = myRoutine, myOtherRoutine)

We could then call that routine from R using the command

 .Call(myRoutine_sym, x, y)

Symbols without explicit names are assigned to the R variable with that name.

In some cases, it may be preferable not to create R variables in the package’s namespace that identify the native routines. It may be too costly to compute these for many routines when the package is loaded if many of these routines are not likely to be used. In this case, one can still perform the symbol resolution correctly using the DLL, but do this each time the routine is called. Given a reference to the DLL as an R variable, say dll, we can call the routine myRoutine using the expression

 .Call(dll$myRoutine, x, y)

The $ operator resolves the routine with the given name in the DLL using a call to getNativeSymbolInfo. This is the same computation as above where we resolve the symbol when the package is loaded. The only difference is that this is done each time in the case of dll$myRoutine.

In order to use this dynamic approach (e.g., dll$myRoutine), one needs the reference to the DLL as an R variable in the package. The DLL can be assigned to a variable by using the variable = dllName format used above for mapping symbols to R variables. For example, if we wanted to assign the DLL reference for the DLL foo in the example above to the variable myDLL, we would use the following directive in the NAMESPACE file:

myDLL = useDynLib(foo, myRoutine_sym = myRoutine, myOtherRoutine)

Then, the R variable myDLL is in the package’s namespace and available for calls such as myDLL$dynRoutine to access routines that are not explicitly resolved at load time.

If the package has registration information (see Registering native routines), then we can use that directly rather than specifying the list of symbols again in the useDynLib directive in the NAMESPACE file. Each routine in the registration information is specified by giving a name by which the routine is to be specified along with the address of the routine and any information about the number and type of the parameters. Using the .registration argument of useDynLib, we can instruct the namespace mechanism to create R variables for these symbols. For example, suppose we have the following registration information for a DLL named myDLL:

static R_NativePrimitiveArgType foo_t[] = {
    REALSXP, INTSXP, STRSXP, LGLSXP
};

static const R_CMethodDef cMethods[] = {
   {"foo", (DL_FUNC) &foo, 4, foo_t},
   {"bar_sym", (DL_FUNC) &bar, 0},
   {NULL, NULL, 0, NULL}
};

static const R_CallMethodDef callMethods[] = {
   {"R_call_sym", (DL_FUNC) &R_call, 4},
   {"R_version_sym", (DL_FUNC) &R_version, 0},
   {NULL, NULL, 0}
};

Then, the directive in the NAMESPACE file

useDynLib(myDLL, .registration = TRUE)

causes the DLL to be loaded and also for the R variables foo, bar_sym, R_call_sym and R_version_sym to be defined in the package’s namespace.

Note that the names for the R variables are taken from the entry in the registration information and do not need to be the same as the name of the native routine. This allows the creator of the registration information to map the native symbols to non-conflicting variable names in R, e.g. R_version to R_version_sym for use in an R function such as

R_version <- function()
{
  .Call(R_version_sym)
}

Using argument .fixes allows an automatic prefix to be added to the registered symbols, which can be useful when working with an existing package. For example, package KernSmooth has

useDynLib(KernSmooth, .registration = TRUE, .fixes = "F_")

which makes the R variables corresponding to the FORTRAN symbols F_bkde and so on, and so avoids clashes with R code in the namespace.

NB: Using these arguments for a package which does not register native symbols merely slows down the package loading (although at the time of writing 90 CRAN packages did so). Once symbols are registered, check that the corresponding R variables are not accidentally exported by a pattern in the NAMESPACE file.


Next: , Previous: , Up: Package namespaces   [Contents][Index]

1.5.5 An example

As an example consider two packages named foo and bar. The R code for package foo in file foo.R is

x <- 1
f <- function(y) c(x,y)
foo <- function(x) .Call("foo", x, PACKAGE="foo")
print.foo <- function(x, ...) cat("<a foo>\n")

Some C code defines a C function compiled into DLL foo (with an appropriate extension). The NAMESPACE file for this package is

useDynLib(foo)
export(f, foo)
S3method(print, foo)

The second package bar has code file bar.R

c <- function(...) sum(...)
g <- function(y) f(c(y, 7))
h <- function(y) y+9

and NAMESPACE file

import(foo)
export(g, h)

Calling library(bar) loads bar and attaches its exports to the search path. Package foo is also loaded but not attached to the search path. A call to g produces

> g(6)
[1]  1 13

This is consistent with the definitions of c in the two settings: in bar the function c is defined to be equivalent to sum, but in foo the variable c refers to the standard function c in base.


Previous: , Up: Package namespaces   [Contents][Index]

1.5.6 Namespaces with S4 classes and methods

Some additional steps are needed for packages which make use of formal (S4-style) classes and methods (unless these are purely used internally). The package should have Depends: methods in its DESCRIPTION file63 and import(methods) or importFrom(methods, ...) in its NAMESPACE file, and any classes and methods which are to be exported need to be declared in the NAMESPACE file. For example, the stats4 package has

export(mle) # exporting methods implicitly exports the generic
importFrom("graphics", plot)
importFrom("stats", optim, qchisq)
## For these, we define methods or (AIC, BIC, nobs) an implicit generic:
importFrom("stats", AIC, BIC, coef, confint, logLik, nobs, profile,
           update, vcov)
exportClasses(mle, profile.mle, summary.mle)
## All methods for imported generics:
exportMethods(coef, confint, logLik, plot, profile, summary,
              show, update, vcov)
## implicit generics which do not have any methods here
export(AIC, BIC, nobs)

All S4 classes to be used outside the package need to be listed in an exportClasses directive. Alternatively, they can be specified using exportClassPattern64 in the same style as for exportPattern. To export methods for generics from other packages an exportMethods directive can be used.

Note that exporting methods on a generic in the namespace will also export the generic, and exporting a generic in the namespace will also export its methods. If the generic function is not local to this package, either because it was imported as a generic function or because the non-generic version has been made generic solely to add S4 methods to it (as for functions such as plot in the example above), it can be declared via either or both of export or exportMethods, but the latter is clearer (and is used in the stats4 example above). In particular, for primitive functions there is no generic function, so export would export the primitive, which makes no sense. On the other hand, if the generic is local to this package, it is more natural to export the function itself using export(), and this must be done if an implicit generic is created without setting any methods for it (as is the case for AIC in stats4).

A non-local generic function is only exported to ensure that calls to the function will dispatch the methods from this package (and that is not done or required when the methods are for primitive functions). For this reason, you do not need to document such implicitly created generic functions, and undoc in package tools will not report them.

If a package uses S4 classes and methods exported from another package, but does not import the entire namespace of the other package65, it needs to import the classes and methods explicitly, with directives

importClassesFrom(package, ...)
importMethodsFrom(package, ...)

listing the classes and functions with methods respectively. Suppose we had two small packages A and B with B using A. Then they could have NAMESPACE files

export(f1, ng1)
exportMethods("[")
exportClasses(c1)

and

importFrom(A, ng1)
importClassesFrom(A, c1)
importMethodsFrom(A, f1)
export(f4, f5)
exportMethods(f6, "[")
exportClasses(c1, c2)

respectively.

Note that importMethodsFrom will also import any generics defined in the namespace on those methods.

It is important if you export S4 methods that the corresponding generics are available. You may for example need to import plot from graphics to make visible a function to be converted into its implicit generic. But it is better practice to make use of the generics exported by stats4 as this enables multiple packages to unambiguously set methods on those generics.


Next: , Previous: , Up: Creating R packages   [Contents][Index]

1.6 Writing portable packages

This section contains advice on writing packages to be used on multiple platforms or for distribution (for example to be submitted to a package repository such as CRAN).

Portable packages should have simple file names: use only alphanumeric ASCII characters and period (.), and avoid those names not allowed under Windows which are mentioned above.

Many of the graphics devices are platform-specific: even X11() (aka x11()), although emulated on Windows, may not be available on a Unix-alike (and is not the preferred screen device on OS X). It is rarely necessary for package code or examples to open a new device, but if essential,66 use dev.new().

Use R CMD build to make the release .tar.gz file.

R CMD check provides a basic set of checks, but often further problems emerge when people try to install and use packages submitted to CRAN – many of these involve compiled code. Here are some further checks that you can do to make your package more portable.

Do be careful in what your tests (and examples) actually test. Bad practices seen in distributed packages include:


Next: , Previous: , Up: Writing portable packages   [Contents][Index]

1.6.1 PDF size

There are several tools available to reduce the size of PDF files: often the size can be reduced substantially with no or minimal loss in quality. Not only do large files take up space: they can stress the PDF viewer and take many minutes to print (if they can be printed at all).

qpdf (http://qpdf.sourceforge.net/) can compress losslessly. It is fairly readily available (e.g. it has binaries for Windows and packages in Debian/Ubuntu/Fedora, and is installed as part of the CRAN macOS distribution of R). R CMD build has an option to run qpdf over PDF files under inst/doc and replace them if at least 10Kb and 10% is saved. The full path to the qpdf command can be supplied as environment variable R_QPDF (and is on the CRAN binary of R for macOS). It seems MiKTeX does not use PDF object compression and so qpdf can reduce considerably the files it outputs: MiKTeX can be overridden by code in the preamble of an Sweave or LaTeX file — see how this is done for the R reference manual at https://svn.r-project.org/R/trunk/doc/manual/refman.top.

Other tools can reduce the size of PDFs containing bitmap images at excessively high resolution. These are often best re-generated (for example Sweave defaults to 300 ppi, and 100–150 is more appropriate for a package manual). These tools include Adobe Acrobat (not Reader), Apple’s Preview84 and Ghostscript (which converts PDF to PDF by

ps2pdf options -dAutoRotatePages=/None in.pdf out.pdf

and suitable options might be

-dPDFSETTINGS=/ebook
-dPDFSETTINGS=/screen

; see http://www.ghostscript.com/doc/current/Ps2pdf.htm for more such and consider all the options for image downsampling). There have been examples in CRAN packages for which Ghostscript 9.06 and later produced much better reductions than 9.05 or earlier.

We occasionally come across large PDF files containing excessively complicated figures using PDF vector graphics: such figures are often best redesigned or, failing that, output as PNG files.

Option --compact-vignettes to R CMD build defaults to value ‘qpdf’: use ‘both’ to try harder to reduce the size, provided you have Ghostscript available (see the help for tools::compactPDF).
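
The same compaction can also be run by hand from R; a minimal sketch (the path mypkg/inst/doc is hypothetical, and Ghostscript needs to be on the path or located via the R_GSCMD environment variable):

## Compress PDFs under inst/doc: losslessly with qpdf, then with
## Ghostscript's "ebook" settings for further (lossy) image downsampling.
tools::compactPDF("mypkg/inst/doc", gs_quality = "ebook")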


Next: , Previous: , Up: Writing portable packages   [Contents][Index]

1.6.2 Check timing

There are several ways to find out where time is being spent in the check process. Start by setting the environment variable _R_CHECK_TIMINGS_ to ‘0’. This will report the total CPU times (not Windows) and elapsed times for installation and running examples, tests and vignettes, under each sub-architecture if appropriate. For tests and vignettes, it reports the time for each as well as the total.

Setting _R_CHECK_TIMINGS_ to a positive value sets a threshold (in seconds elapsed time) for reporting timings.

If you need to look in more detail at the timings for examples, use option --timings to R CMD check (this is set by --as-cran). This adds a summary to the check output for all the examples with CPU or elapsed time of more than 5 seconds. It produces a file mypkg.Rcheck/mypkg-Ex.timings containing timings for each help file: it is a tab-delimited file which can be read into R for further analysis.
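
For instance, a sketch of examining that file (the package name mypkg is hypothetical, and the column names are assumed to be those written by R CMD check):

## Read the per-help-file timings and list the slowest examples first.
tm <- read.delim("mypkg.Rcheck/mypkg-Ex.timings")
head(tm[order(tm$elapsed, decreasing = TRUE), ])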

Timings for the tests and vignette runs are given at the bottom of the corresponding log file: note that log files for successful vignette runs are only retained if environment variable _R_CHECK_ALWAYS_LOG_VIGNETTE_OUTPUT_ is set to a true value.


Next: , Previous: , Up: Writing portable packages   [Contents][Index]

1.6.3 Encoding issues

Care is needed if your package contains non-ASCII text, and in particular if it is intended to be used in more than one locale. It is possible to mark the encoding used in the DESCRIPTION file and in .Rd files, as discussed elsewhere in this manual.

First, consider carefully if you really need non-ASCII text. Many users of R will only be able to view correctly text in their native language group (e.g. Western European, Eastern European, Simplified Chinese) and ASCII.85. Other characters may not be rendered at all, rendered incorrectly, or cause your R code to give an error. For .Rd documentation, marking the encoding and including ASCII transliterations is likely to do a reasonable job. The set of characters which is commonly supported is wider than it used to be around 2000, but non-Latin alphabets (Greek, Russian, Georgian, …) are still often problematic and those with double-width characters (Chinese, Japanese, Korean) often need specialist fonts to render correctly.

Several CRAN packages have messages in their R code in French (and a few in German). A better way to tackle this is to use the internationalization facilities discussed elsewhere in this manual.

Function showNonASCIIfile in package tools can help in finding non-ASCII bytes in files.

There is a portable way to have arbitrary text in character strings (only) in your R code, which is to supply them in Unicode as \uxxxx escapes. If there are any characters not in the current encoding the parser will encode the character string as UTF-8 and mark it as such. This applies also to character strings in datasets: they can be prepared using \uxxxx escapes or encoded in UTF-8 in a UTF-8 locale, or even converted to UTF-8 via ‘iconv()’. If you do this, make sure you have ‘R (>= 2.10)’ (or later) in the ‘Depends’ field of the DESCRIPTION file.
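
A small sketch combining the two points above (the file name R/messages.R is hypothetical):

## A portable way to write "café" while keeping the source file ASCII-only.
greeting <- "caf\u00e9"

## Locate any remaining non-ASCII bytes in a source file.
tools::showNonASCIIfile("R/messages.R")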

R sessions running in non-UTF-8 locales will if possible re-encode such strings for display (and this is done by RGui on Windows, for example). Suitable fonts will need to be selected or made available86 both for the console/terminal and graphics devices such as ‘X11()’ and ‘windows()’. Using ‘postscript’ or ‘pdf’ will choose a default 8-bit encoding depending on the language of the UTF-8 locale, and your users would need to be told how to select the ‘encoding’ argument.

If you want to run R CMD check on a Unix-alike over a package that sets a package encoding in its DESCRIPTION file and do not use a UTF-8 locale you may need to specify a suitable locale via environment variable R_ENCODING_LOCALES. The default is equivalent to the value

"latin1=en_US:latin2=pl_PL:UTF-8=en_US.UTF-8:latin9=fr_FR.iso885915@euro"

(which is appropriate for a system based on glibc: macOS requires latin9=fr_FR.ISO8859-15) except that if the current locale is UTF-8 then the package code is translated to UTF-8 for syntax checking, so it is strongly recommended to check in a UTF-8 locale.


Next: , Previous: , Up: Writing portable packages   [Contents][Index]

1.6.4 Portable C and C++ code

Writing portable C and C++ code is mainly a matter of observing the standards (C99, C++98 or where declared C++11/14) and testing that extensions (such as POSIX functions) are supported.

Note that the ‘TR1’ C++ extensions are not part of any of these standards and the <tr1/name> headers are not supplied by some of the compilers used for R, including on macOS. (Use the C++11 versions instead.)

Note too that the POSIX standards only require recently-defined functions to be declared if certain macros are defined with large enough values, and on some compiler/OS combinations87 they are not declared otherwise. So you may need to include something like one of 88

#define _XOPEN_SOURCE 500

or

#ifdef __GLIBC__
# define _POSIX_C_SOURCE 200809L
#endif

before any headers. (strdup and strncasecmp are two such functions.)

However, some common errors are worth pointing out here. It can be helpful to look up functions at http://www.cplusplus.com/reference/ or http://en.cppreference.com/w/ and compare what is defined in the various standards.

Both the compiler and OS (via system header files, which may differ by architecture even for nominally the same OS) affect the compilability of C/C++ code. Compilers from the GCC, clang, Intel and Oracle Studio suites are routinely used with R, and both clang and Oracle have more than one implementation of C++ headers and library. The range of possibilities makes comprehensive empirical checking impossible, and regrettably compilers are patchy at best on warning about non-standard code.

Some additional information for C++ is available at http://journal.r-project.org/archive/2011-2/RJournal_2011-2_Plummer.pdf by Martyn Plummer.


Previous: , Up: Writing portable packages   [Contents][Index]

1.6.5 Binary distribution

If you want to distribute a binary version of a package on Windows or OS X, there are further checks you need to do to check it is portable: it is all too easy to depend on external software on your own machine that other users will not have.

For Windows, check what other DLLs your package’s DLL depends on (‘imports from’ in the DLL tools’ parlance). A convenient GUI-based tool to do so is ‘Dependency Walker’ (http://www.dependencywalker.com/) for both 32-bit and 64-bit DLLs – note that this will report as missing links to R’s own DLLs such as R.dll and Rblas.dll. For 32-bit DLLs only, the command-line tool pedump.exe -i (in Rtools*.exe) can be used, and for the brave, the objdump tool in the appropriate toolchain will also reveal what DLLs are imported from. If you use a toolchain other than one provided by the R developers or use your own makefiles, watch out in particular for dependencies on the toolchain’s runtime DLLs such as libgfortran, libstdc++ and libgcc_s.

For macOS, using R CMD otool -L on the package’s shared object(s) in the libs directory will show what they depend on: watch for any dependencies in /usr/local/lib or /usr/local/gfortran/lib, notably libgfortran.?.dylib and libquadmath.0.dylib.

Many people (including the CRAN package repository) will not accept source packages containing binary files as the latter are a security risk. If you want to distribute a source package which needs external software on Windows or macOS, options include

Be aware that license requirements will need to be met so you may need to supply the sources for the additional components (and will if your package has a GPL-like license).


Next: , Previous: , Up: Creating R packages   [Contents][Index]

1.7 Diagnostic messages

Diagnostic messages can be made available for translation, so it is important to write them in a consistent style. Using the tools described in the next section to extract all the messages can give a useful overview of your consistency (or lack of it). Some guidelines follow.


Next: , Previous: , Up: Creating R packages   [Contents][Index]

1.8 Internationalization

There are mechanisms to translate the R- and C-level error and warning messages. These are only available if R is compiled with NLS support (which is requested by configure option --enable-nls, the default).

The procedures make use of msgfmt and xgettext which are part of GNU gettext and this will need to be installed: Windows users can find pre-compiled binaries at https://www.stats.ox.ac.uk/pub/Rtools/goodies/gettext-tools.zip.


Next: , Previous: , Up: Internationalization   [Contents][Index]

1.8.1 C-level messages

The process of enabling translations is

The file src/pkg.pot is the template file, and conventionally this is shipped as po/pkg.pot.


Next: , Previous: , Up: Internationalization   [Contents][Index]

1.8.2 R messages

Mechanisms are also available to support the automatic translation of R stop, warning and message messages. They make use of message catalogs in the same way as C-level messages, but using domain R-pkg rather than pkg. Translation of character strings inside stop, warning and message calls is automatically enabled, as well as other messages enclosed in calls to gettext or gettextf. (To suppress this, use argument domain=NA.)

Tools to prepare the R-pkg.pot file are provided in package tools: xgettext2pot will prepare a file from all strings occurring inside gettext/gettextf, stop, warning and message calls. Some of these are likely to be spurious and so the file is likely to need manual editing. xgettext extracts the actual calls and so is more useful when tidying up error messages.
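
For example, a hedged sketch of creating the template for a package whose sources are in directory pkgdir (the paths are illustrative):

## Extract translatable strings from the package's R code into a template;
## the result usually needs manual editing before being shipped in po.
tools::xgettext2pot("pkgdir", "pkgdir/po/R-pkg.pot")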

The R function ngettext provides an interface to the C function of the same name: see example in the previous section. It is safest to use domain="R-pkg" explicitly in calls to ngettext, and necessary for earlier versions of R unless they are calls directly from a function in the package.
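
For example, a minimal sketch (the domain assumes the package is called mypkg):

## Singular/plural message selection; n is the number of items concerned.
n <- 2L
message(sprintf(ngettext(n, "%d file was removed", "%d files were removed",
                         domain = "R-mypkg"), n))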


Previous: , Up: Internationalization   [Contents][Index]

1.8.3 Preparing translations

Once the template files have been created, translations can be made. Conventional translations have file extension .po and are placed in the po subdirectory of the package with a name that is either ‘ll.po’ or ‘R-ll.po’ for translations of the C and R messages respectively to language with code ‘ll’.

See Localization of messages in R Installation and Administration, for details of language codes.

There is an R function, update_pkg_po in package tools, to automate much of the maintenance of message translations. See its help for what it does in detail.
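
A minimal sketch of a maintainer’s call (the path is hypothetical):

## Create or refresh templates, merge any existing .po files and install
## the compiled translations for the package whose sources are in pkgdir.
tools::update_pkg_po("path/to/pkgdir")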

If this is called on a package with no existing translations, it creates the directory pkgdir/po, creates a template file of R messages, pkgdir/po/R-pkg.pot, within it, creates the ‘en@quot’ translation and installs that. (The ‘en@quot’ pseudo-language interprets quotes in their directional forms in suitable (e.g. UTF-8) locales.)

If the package has C source files in its src directory that are marked for translation, use

touch pkgdir/po/pkg.pot

to create a dummy template file, then call update_pkg_po again (this can also be done before it is called for the first time).

When translations to new languages are added in the pkgdir/po directory, running the same command will check and then install the translations.

If the package sources are updated, the same command will update the template files, merge the changes into the translation .po files and then install the updated translations. You will often see that merging marks translations as ‘fuzzy’ and this is reported in the coverage statistics. As fuzzy translations are not used, this is an indication that the translation files need human attention.

The merged translations are run through tools::checkPoFile to check that C-style formats are used correctly: if not, the mismatches are reported and the broken translations are not installed.

This function needs the GNU gettext-tools installed and on the path: see its help page.


Next: , Previous: , Up: Creating R packages   [Contents][Index]

1.9 CITATION files

An installed file named CITATION will be used by the citation() function. (It should be in the inst subdirectory of the package sources.)

The CITATION file is parsed as R code (in the package’s declared encoding, or in ASCII if none is declared). If no such file is present, citation auto-generates citation information from the package DESCRIPTION metadata, and an example of what that would look like as a CITATION file can be seen in recommended package nlme (see below): recommended packages boot, cluster and mgcv have further examples.

A CITATION file will contain calls to function bibentry.

Here is that for nlme:

year <- sub("-.*", "", meta$Date)
note <- sprintf("R package version %s", meta$Version)

bibentry(bibtype = "Manual",
         title = "{nlme}: Linear and Nonlinear Mixed Effects Models",
         author = c(person("Jose", "Pinheiro"),
                    person("Douglas", "Bates"),
                    person("Saikat", "DebRoy"),
                    person("Deepayan", "Sarkar"),
                    person("R Core Team")),
         year = year,
         note = note,
         url = "https://CRAN.R-project.org/package=nlme")

Note the way that information that may need to be updated is picked up from object meta, a parsed version of the DESCRIPTION file – it is tempting to hardcode such information, but it normally then gets outdated. See ?bibentry for further details of the information which can be provided.

In case a bibentry contains LaTeX markup (e.g., for accented characters or mathematical symbols), it may be necessary to provide a text representation to be used for printing via the textVersion argument to bibentry. E.g., earlier versions of nlme additionally used

         textVersion =
         paste0("Jose Pinheiro, Douglas Bates, Saikat DebRoy,",
                "Deepayan Sarkar and the R Core Team (",
                year,
                "). nlme: Linear and Nonlinear Mixed Effects Models. ",
                note, ".")

The CITATION file should itself produce no output when source-d.

It is desirable (and essential for CRAN) that the CITATION file does not contain calls to functions such as packageDescription which assume the package is installed in a library tree on the package search path.


Next: , Previous: , Up: Creating R packages   [Contents][Index]

1.10 Package types

The DESCRIPTION file has an optional field Type which if missing is assumed to be ‘Package’, the sort of extension discussed so far in this chapter. Currently one other type is recognized; there used also to be a ‘Translation’ type.


Previous: , Up: Package types   [Contents][Index]

1.10.1 Frontend

This is a rather general mechanism, designed for adding new front-ends such as the former gnomeGUI package (see the Archive area on CRAN). If a configure file is found in the top-level directory of the package it is executed, and then if a Makefile is found (often generated by configure), make is called. If R CMD INSTALL --clean is used make clean is called. No other action is taken.

R CMD build can package up this type of extension, but R CMD check will check the type and skip it.

Many packages of this type need write permission for the R installation directory.


Previous: , Up: Creating R packages   [Contents][Index]

1.11 Services

Several members of the R project have set up services to assist those writing R packages, particularly those intended for public distribution.

win-builder.r-project.org offers the automated preparation of (32/64-bit) Windows binaries from well-tested source packages.

R-Forge (R-Forge.r-project.org) and RForge (www.rforge.net) are similar services with similar names. Both provide source-code management through SVN, daily building and checking, mailing lists and a repository that can be accessed via install.packages (they can be selected by setRepositories and the GUI menus that use it). Package developers have the opportunity to present their work on the basis of project websites or news announcements. Mailing lists, forums or wikis provide useRs with convenient instruments for discussions and for exchanging information between developers and/or interested useRs.


Next: , Previous: , Up: Top   [Contents][Index]

2 Writing R documentation files


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.1 Rd format

R objects are documented in files written in “R documentation” (Rd) format, a simple markup language much of which closely resembles (La)TeX, which can be processed into a variety of formats, including LaTeX, HTML and plain text. The translation is carried out by functions in the tools package called by the script Rdconv in R_HOME/bin and by the installation scripts for packages.

The R distribution contains more than 1300 such files which can be found in the src/library/pkg/man directories of the R source tree, where pkg stands for one of the standard packages which are included in the R distribution.

As an example, let us look at a simplified version of src/library/base/man/load.Rd which documents the R function load.

% File src/library/base/man/load.Rd
\name{load}
\alias{load}
\title{Reload Saved Datasets}
\description{
  Reload the datasets written to a file with the function
  \code{save}.
}
\usage{
load(file, envir = parent.frame())
}
\arguments{
  \item{file}{a connection or a character string giving the
    name of the file to load.}
  \item{envir}{the environment where the data should be
    loaded.}
}
\seealso{
  \code{\link{save}}.
}
\examples{
## save all data
save(list = ls(), file= "all.RData")

## restore the saved values to the current environment
load("all.RData")

## restore the saved values to the workspace
load("all.RData", .GlobalEnv)
}
\keyword{file}

An Rd file consists of three parts. The header gives basic information about the name of the file, the topics documented, a title, a short textual description and R usage information for the objects documented. The body gives further information (for example, on the function’s arguments and return value, as in the above example). Finally, there is an optional footer with keyword information. The header is mandatory.

Information is given within a series of sections with standard names (and user-defined sections are also allowed). Unless otherwise specified92 these should occur only once in an Rd file (in any order), and the processing software will retain only the first occurrence of a standard section in the file, with a warning.

See “Guidelines for Rd files” for guidelines for writing documentation in Rd format which should be useful for package writers. The R generic function prompt is used to construct a bare-bones Rd file ready for manual editing. Methods are defined for documenting functions (which fill in the proper function and argument names) and data frames. There are also functions promptData, promptPackage, promptClass, and promptMethods for other types of Rd file.
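
For example, a quick way to start the documentation for an existing function (the output file name is up to you):

## Write a skeleton Rd file for base::load, ready for manual editing.
prompt(load, filename = "load.Rd")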

The general syntax of Rd files is summarized below. For a detailed technical discussion of current Rd syntax, see “Parsing Rd files”.

Rd files consist of four types of text input. The most common is LaTeX-like, with the backslash used as a prefix on markup (e.g. \alias), and braces used to indicate arguments (e.g. {load}). The least common type of text is ‘verbatim’ text, where no markup other than the comment marker (%) is processed. There is also a rare variant of ‘verbatim’ text (used in \eqn, \deqn, \figure, and \newcommand) where comment markers need not be escaped. The final type is R-like, intended for R code, but allowing some embedded macros. Quoted strings within R-like text are handled specially: regular character escapes such as \n may be entered as-is. Only markup starting with \l (e.g. \link) or \v (e.g. \var) will be recognized within quoted strings. The rarely used vertical tab \v must be entered as \\v.

Each macro defines the input type for its argument. For example, the file initially uses LaTeX-like syntax, and this is also used in the \description section, but the \usage section uses R-like syntax, and the \alias macro uses ‘verbatim’ syntax. Comments run from a percent symbol % to the end of the line in all types of text except the rare ‘verbatim’ variant (as on the first line of the load example).

Because backslashes, braces and percent symbols have special meaning, to enter them into text sometimes requires escapes using a backslash. In general balanced braces do not need to be escaped, but percent symbols always do, except in the ‘verbatim’ variant. For the complete list of macros and rules for escapes, see “Parsing Rd files”.


Next: , Previous: , Up: Rd format   [Contents][Index]

2.1.1 Documenting functions

The basic markup commands used for documenting R objects (in particular, functions) are given in this subsection.

\name{name}

name typically93 is the basename of the Rd file containing the documentation. It is the “name” of the Rd object represented by the file and has to be unique in a package. To avoid problems with indexing the package manual, it may not contain ‘!’ ‘|’ nor ‘@’, and to avoid possible problems with the HTML help system it should not contain ‘/’ nor a space. (LaTeX special characters are allowed, but may not be collated correctly in the index.) There can only be one \name entry in a file, and it must not contain any markup. Entries in the package manual will be in alphabetic94 order of the \name entries.

\alias{topic}

The \alias sections specify all “topics” the file documents. This information is collected into index data bases for lookup by the on-line (plain text and HTML) help systems. The topic can contain spaces, but (for historical reasons) leading and trailing spaces will be stripped. Percent and left brace need to be escaped by a backslash.

There may be several \alias entries. Quite often it is convenient to document several R objects in one file. For example, file Normal.Rd documents the density, distribution function, quantile function and generation of random variates for the normal distribution, and hence starts with

\name{Normal}
\alias{Normal}
\alias{dnorm}
\alias{pnorm}
\alias{qnorm}
\alias{rnorm}

Also, it is often convenient to have several different ways to refer to an R object, and an \alias does not need to be the name of an object.

Note that the \name is not necessarily a topic documented, and if so desired it needs to have an explicit \alias entry (as in this example).

\title{Title}

Title information for the Rd file. This should be capitalized and not end in a period; try to limit its length to at most 65 characters for widest compatibility.

Markup is supported in the text, but use of characters other than English text and punctuation (e.g., ‘<’) may limit portability.

There must be one (and only one) \title section in a help file.

\description{…}

A short description of what the function(s) do(es) (one paragraph, a few lines only). (If a description is too long and cannot easily be shortened, the file probably tries to document too much at once.) This is mandatory except for package-overview files.

\usage{fun(arg1, arg2, …)}

One or more lines showing the synopsis of the function(s) and variables documented in the file. These are set in typewriter font. This is an R-like command.

The usage information specified should match the function definition exactly (such that automatic checking for consistency between code and documentation is possible).

It is no longer advisable to use \synopsis for the actual synopsis and show modified synopses in the \usage. Support for \synopsis was removed in R 3.1.0. To indicate that a function can be used in several different ways, depending on the named arguments specified, use section \details. E.g., abline.Rd contains

\details{
  Typical usages are
\preformatted{abline(a, b, untf = FALSE, \dots)
......
}

Use \method{generic}{class} to indicate the name of an S3 method for the generic function generic for objects inheriting from class "class". In the printed versions, this will come out as generic (reflecting the understanding that methods should not be invoked directly but via method dispatch), but codoc() and other QC tools always have access to the full name.

For example, print.ts.Rd contains

\usage{
\method{print}{ts}(x, calendar, \dots)
}

which will print as

Usage:

     ## S3 method for class ‘ts’:
     print(x, calendar, ...)

Usage for replacement functions should be given in the style of dim(x) <- value rather than explicitly indicating the name of the replacement function ("dim<-" in the above). Similarly, one can use \method{generic}{class}(arglist) <- value to indicate the usage of an S3 replacement method for the generic replacement function "generic<-" for objects inheriting from class "class".

Usage for S3 methods for extracting or replacing parts of an object, S3 methods for members of the Ops group, and S3 methods for user-defined (binary) infix operators (‘%xxx%’) follows the above rules, using the appropriate function names. E.g., Extract.factor.Rd contains

\usage{
\method{[}{factor}(x, \dots, drop = FALSE)
\method{[[}{factor}(x, \dots)
\method{[}{factor}(x, \dots) <- value
}

which will print as

Usage:

     ## S3 method for class ‘factor’:
     x[..., drop = FALSE]
     ## S3 method for class ‘factor’:
     x[[...]]
     ## S3 replacement method for class ‘factor’:
     x[...] <- value

\S3method is accepted as an alternative to \method.

\arguments{…}

Description of the function’s arguments, using an entry of the form

\item{arg_i}{Description of arg_i.}

for each element of the argument list. (Note that there is no whitespace between the three parts of the entry.) There may be optional text outside the \item entries, for example to give general information about groups of parameters.
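
As an illustrative sketch (the function arguments x and na.rm are invented for this example), an \arguments section might read

\arguments{
  \item{x}{A numeric vector of observations.}
  \item{na.rm}{Logical: should missing values be dropped before computation?}
}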

\details{…}

A detailed and, if possible, precise description of the functionality provided, extending the basic information in the \description slot.

\value{…}

Description of the function’s return value.

If a list with multiple values is returned, you can use entries of the form

\item{comp_i}{Description of comp_i.}

for each component of the list returned. Optional text may precede95 this list (see for example the help for rle). Note that \value is implicitly a \describe environment, so that environment should not be used for listing components, just individual \item{}{} entries.
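
For instance, a \value section for a function returning a two-component list (component names here are illustrative, in the style of the help for rle) might read

\value{
  A list with components
  \item{lengths}{an integer vector containing the length of each run.}
  \item{values}{a vector of the same length as \code{lengths} with the
    corresponding values.}
}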

\references{…}

A section with references to the literature. Use \url{} or \href{}{} for web pointers.

\note{…}

Use this for a special note you want to have pointed out. Multiple \note sections are allowed, but might be confusing to the end users.

For example, pie.Rd contains

\note{
  Pie charts are a very bad way of displaying information.
  The eye is good at judging linear measures and bad at
  judging relative areas.
  ......
}
\author{…}

Information about the author(s) of the Rd file. Use \email{} without extra delimiters (such as ‘( )’ or ‘< >’) to specify email addresses, or \url{} or \href{}{} for web pointers.

\seealso{…}

Pointers to related R objects, using \code{\link{...}} to refer to them (\code is the correct markup for R object names, and \link produces hyperlinks in output formats which support this. See Marking text, and Cross-references).

\examples{…}

Examples of how to use the function. Code in this section is set in typewriter font without reformatting and is run by example() unless marked otherwise (see below).

Examples are not only useful for documentation purposes, but also provide test code used for diagnostic checking of R code. By default, text inside \examples{} will be displayed in the output of the help page and run by example() and by R CMD check. You can use \dontrun{} for text that should only be shown, but not run, and \dontshow{} for extra commands for testing that should not be shown to users, but will be run by example(). (Previously this was called \testonly, and that is still accepted.)

Text inside \dontrun{} is ‘verbatim’, but the other parts of the \examples section are R-like text.

For example,

x <- runif(10)       # Shown and run.
\dontrun{plot(x)}    # Only shown.
\dontshow{log(x)}    # Only run.

Thus, example code not included in \dontrun must be executable! In addition, it should not use any system-specific features or require special facilities (such as Internet access or write permission to specific directories). Text included in \dontrun is indicated by comments in the processed help files: it need not be valid R code but the escapes must still be used for %, \ and unpaired braces as in other ‘verbatim’ text.

Example code must be capable of being run by example, which uses source. This means that it should not access stdin, e.g. to scan() data from the example file.

Data needed for making the examples executable can be obtained by random number generation (for example, x <- rnorm(100)), or by using standard data sets listed by data() (see ?data for more info).

Finally, there is \donttest, used (at the beginning of a separate line) to mark code that should be run by example() but not by R CMD check (by default: the option --run-donttest can be used). This should be needed only occasionally but can be used for code which might fail in circumstances that are hard to test for, for example in some locales. (Use e.g. capabilities() or nzchar(Sys.which("someprogram")) to test for features needed in the examples wherever possible, and you can also use try() or tryCatch(). Use interactive() to condition examples which need someone to interact with.) Note that code included in \donttest must be correct R code, and any packages used should be declared in the DESCRIPTION file. It is good practice to include a comment in the \donttest section explaining why it is needed.
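
As a sketch of this, an examples section guarding a platform-dependent plot with \donttest (the capability test shown is just one possibility) might look like

\examples{
x <- rnorm(100)
mean(x)
\donttest{
## needs a working X11 device, which cannot be assumed when checking
if (capabilities("X11")) plot(x)
}
}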

As from R 3.4.0, output from code between comments

## IGNORE_RDIFF_BEGIN
## IGNORE_RDIFF_END

is ignored when comparing check output to reference output (a -Ex.Rout.save file).

\keyword{key}

There can be zero or more \keyword sections per file. Each \keyword section should specify a single keyword, preferably one of the standard keywords as listed in file KEYWORDS in the R documentation directory (default R_HOME/doc). Use e.g. RShowDoc("KEYWORDS") to inspect the standard keywords from within R. There can be more than one \keyword entry if the R object being documented falls into more than one category, or none.

Do strongly consider using \concept (see Indices) instead of \keyword if you are about to use more than very few non-standard keywords.

The special keyword ‘internal’ marks a page of internal objects that are not part of the package’s API. If the help page for object foo has keyword ‘internal’, then help(foo) gives this help page, but foo is excluded from several object indices, including the alphabetical list of objects in the HTML help system.

help.search() can search by keyword, including user-defined values: however the ‘Search Engine & Keywords’ HTML page accessed via help.start() provides single-click access only to a pre-defined list of keywords.


Next: , Previous: , Up: Rd format   [Contents][Index]

2.1.2 Documenting data sets

The structure of Rd files which document R data sets is slightly different. Sections such as \arguments and \value are not needed but the format and source of the data should be explained.

As an example, let us look at src/library/datasets/man/rivers.Rd which documents the standard R data set rivers.

\name{rivers}
\docType{data}
\alias{rivers}
\title{Lengths of Major North American Rivers}
\description{
  This data set gives the lengths (in miles) of 141 \dQuote{major}
  rivers in North America, as compiled by the US Geological
  Survey.
}
\usage{rivers}
\format{A vector containing 141 observations.}
\source{World Almanac and Book of Facts, 1975, page 406.}
\references{
  McNeil, D. R. (1977) \emph{Interactive Data Analysis}.
  New York: Wiley.
}
\keyword{datasets}

This uses the following additional markup commands.

\docType{…}

Indicates the “type” of the documentation object. Always ‘data’ for data sets, and ‘package’ for pkg-package.Rd overview files. Documentation for S4 methods and classes uses ‘methods’ (from promptMethods()) and ‘class’ (from promptClass()).

\format{…}

A description of the format of the data set (as a vector, matrix, data frame, time series, …). For matrices and data frames this should give a description of each column, preferably as a list or table. See Lists and tables, for more information.

\source{…}

Details of the original source (a reference or URL, see Specifying URLs). In addition, section \references could give secondary sources and usages.

Note also that when documenting data set bar, the \usage entry is always bar or (for packages which do not use lazy-loading of data) data(bar), and the \keyword entry should always be ‘datasets’.

If bar is a data frame, documenting it as a data set can be initiated via prompt(bar). Otherwise, the promptData function may be used.
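
For example (bar here is an invented data frame, purely for illustration):

bar <- data.frame(x = 1:5, y = letters[1:5])
prompt(bar, filename = "bar.Rd")        # data frame: prompt() writes a data-set skeleton
## promptData(bar, filename = "bar.Rd") # alternative, also works for other kinds of objects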


Next: , Previous: , Up: Rd format   [Contents][Index]

2.1.3 Documenting S4 classes and methods

There are special ways to use the ‘?’ operator, namely ‘class?topic’ and ‘methods?topic’, to access documentation for S4 classes and methods, respectively. This mechanism depends on conventions for the topic names used in \alias entries. The topic names for S4 classes and methods respectively are of the form

class-class
generic,signature_list-method

where signature_list contains the names of the classes in the signature of the method (without quotes) separated by ‘,’ (without whitespace), with ‘ANY’ used for arguments without an explicit specification. E.g., ‘genericFunction-class’ is the topic name for documentation for the S4 class "genericFunction", and ‘coerce,ANY,NULL-method’ is the topic name for documentation for the S4 method for coerce for signature c("ANY", "NULL").

Skeletons of documentation for S4 classes and methods can be generated by using the functions promptClass() and promptMethods() from package methods. If it is necessary or desired to provide an explicit function declaration (in a \usage section) for an S4 method (e.g., if it has “surprising arguments” to be mentioned explicitly), one can use the special markup

\S4method{generic}{signature_list}(argument_list)

(e.g., ‘\S4method{coerce}{ANY,NULL}(from, to)’).

To make full use of the potential of the on-line documentation system, all user-visible S4 classes and methods in a package should at least have a suitable \alias entry in one of the package’s Rd files. If a package has methods for a function defined originally somewhere else, and does not change the underlying default method for the function, the package is responsible for documenting the methods it creates, but not for the function itself or the default method.

An S4 replacement method is documented in the same way as an S3 one: see the description of \method in Documenting functions.

See help("Documentation", package = "methods") for more information on using and creating on-line documentation for S4 classes and methods.


Previous: , Up: Rd format   [Contents][Index]

2.1.4 Documenting packages

Packages may have an overview help page with an \alias pkgname-package, e.g. ‘utils-package’ for the utils package, when package?pkgname will open that help page. If a topic named pkgname does not exist in another Rd file, it is helpful to use this as an additional \alias.

Skeletons of documentation for a package can be generated using the function promptPackage(). If the final = TRUE argument is used, then the Rd file will be generated in final form, containing the information that would be produced up to library(help = pkgname). Otherwise (the default) comments will be inserted giving suggestions for content.
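
For example (mypkg is a hypothetical installed package):

promptPackage("mypkg", filename = "mypkg-package.Rd")
promptPackage("mypkg", final = TRUE, filename = "mypkg-package.Rd")

The first call inserts comments suggesting content; the second writes the page in final form.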

Apart from the mandatory \name and \title and the pkgname-package alias, the only requirement for the package overview page is that it include a \docType{package} statement. All other content is optional. We suggest that it should be a short overview, to give a reader unfamiliar with the package enough information to get started. More extensive documentation is better placed into a package vignette (see Writing package vignettes) and referenced from this page, or into individual man pages for the functions, datasets, or classes.


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.2 Sectioning

To begin a new paragraph or leave a blank line in an example, just insert an empty line (as in (La)TeX). To break a line, use \cr.

In addition to the predefined sections (such as \description{}, \value{}, etc.), you can “define” arbitrary ones by \section{section_title}{…}. For example

\section{Warning}{
  You must not call this function unless …
}

For consistency with the pre-assigned sections, the section name (the first argument to \section) should be capitalized (but not all upper case). Whitespace between the first and second braced expressions is not allowed. Markup (e.g. \code) within the section title may cause problems with the latex conversion (depending on the version of macro packages such as ‘hyperref’) and so should be avoided.

The \subsection macro takes arguments in the same format as \section, but is used within a section, so it may be used to nest subsections within sections or other subsections. There is no predefined limit on the nesting level, but formatting is not designed for more than 3 levels (i.e. subsections within subsections within sections).
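
For example, a user-defined section with a nested subsection (the titles are invented for illustration) could be written as

\section{Implementation notes}{
  The algorithm is iterative.
  \subsection{Convergence}{
    Iteration stops when successive estimates agree to the requested tolerance.
  }
}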

Note that additional named sections are always inserted at a fixed position in the output (before \note, \seealso and the examples), no matter where they appear in the input (but in the same order amongst themselves as in the input).


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.3 Marking text

The following logical markup commands are available for emphasizing or quoting text.

\emph{text}
\strong{text}

Emphasize text using italic and bold font if possible; \strong is regarded as stronger (more emphatic).

\bold{text}

Set text in bold font where possible.

\sQuote{text}
\dQuote{text}

Portably single or double quote text (without hard-wiring the characters used for quotation marks).

Each of the above commands takes LaTeX-like input, so other macros may be used within text.

The following logical markup commands are available for indicating specific kinds of text. Except as noted, these take ‘verbatim’ text input, and so other macros may not be used within them. Some characters will need to be escaped (see Insertions).

\code{text}

Indicate text that is a literal example of a piece of an R program, e.g., a fragment of R code or the name of an R object. Text is entered in R-like syntax, and displayed using typewriter font where possible. Macros \var and \link are interpreted within text.

\preformatted{text}

Indicate text that is a literal example of a piece of a program. Text is displayed using typewriter font where possible. Formatting, e.g. line breaks, is preserved. (Note that this includes a line break after the initial {, so typically text should start on the same line as the command.)

Due to limitations in LaTeX as of this writing, this macro may not be nested within other markup macros other than \dQuote and \sQuote, as errors or bad formatting may result.

\kbd{keyboard-characters}

Indicate keyboard input, using slanted typewriter font if possible, so users can distinguish the characters they are supposed to type from computer output. Text is entered ‘verbatim’.

\samp{text}

Indicate text that is a literal example of a sequence of characters, entered ‘verbatim’. No wrapping or reformatting will occur. Displayed using typewriter font where possible.

\verb{text}

Indicate text that is a literal example of a sequence of characters, with no interpretation of e.g. \var, but which will be included within word-wrapped text. Displayed using typewriter font if possible.

\pkg{package_name}

Indicate the name of an R package. LaTeX-like.

\file{file_name}

Indicate the name of a file. Text is LaTeX-like, so backslash needs to be escaped. Displayed using a distinct font where possible.

\email{email_address}

Indicate an electronic mail address. LaTeX-like, will be rendered as a hyperlink in HTML and PDF conversion. Displayed using typewriter font where possible.

\url{uniform_resource_locator}

Indicate a uniform resource locator (URL) for the World Wide Web. The argument is handled as ‘verbatim’ text (with percent and braces escaped by backslash), and rendered as a hyperlink in HTML and PDF conversion. Linefeeds are removed, and leading and trailing whitespace96 is removed. See Specifying URLs.

Displayed using typewriter font where possible.

\href{uniform_resource_locator}{text}

Indicate a hyperlink to the World Wide Web. The first argument is handled as ‘verbatim’ text (with percent and braces escaped by backslash) and is used as the URL in the hyperlink, with the second argument of LaTeX-like text displayed to the user. Linefeeds are removed from the first argument, and leading and trailing whitespace is removed.

Note that RFC3986-encoded URLs (e.g. using ‘\%28VS.85\%29’ in place of ‘(VS.85)’) may not work correctly in versions of R before 3.1.3 and are best avoided—use URLdecode() to decode them.

\var{metasyntactic_variable}

Indicate a metasyntactic variable. In some cases this will be rendered distinctly, e.g. in italic, but not in all97. LaTeX-like.

\env{environment_variable}

Indicate an environment variable. ‘Verbatim’. Displayed using typewriter font where possible.

\option{option}

Indicate a command-line option. ‘Verbatim’. Displayed using typewriter font where possible.

\command{command_name}

Indicate the name of a command. LaTeX-like, so \var is interpreted. Displayed using typewriter font where possible.

\dfn{term}

Indicate the introductory or defining use of a term. LaTeX-like.

\cite{reference}

Indicate a reference without a direct cross-reference via \link (see Cross-references), such as the name of a book. LaTeX-like.

\acronym{acronym}

Indicate an acronym (an abbreviation written in all capital letters), such as GNU. LaTeX-like.


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.4 Lists and tables

The \itemize and \enumerate commands take a single argument, within which there may be one or more \item commands. The text following each \item is formatted as one or more paragraphs, suitably indented and with the first paragraph marked with a bullet point (\itemize) or a number (\enumerate).

Note that unlike argument lists, \item in these formats is followed by a space and the text (not enclosed in braces). For example

  \enumerate{
    \item A database consists of one or more records, each with one or
    more named fields.
    \item Regular lines start with a non-whitespace character.
    \item Records are separated by one or more empty lines.
  }

\itemize and \enumerate commands may be nested.

The \describe command is similar to \itemize but allows initial labels to be specified. Each \item takes two arguments, the label and the body of the item, in exactly the same way as an argument or value \item. \describe commands are mapped to <DL> lists in HTML and \description lists in LaTeX.
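
For example, a \describe list with two illustrative labels might be written as

\describe{
  \item{speed}{numeric vector of speeds (mph).}
  \item{dist}{numeric vector of stopping distances (ft).}
}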

The \tabular command takes two arguments. The first gives for each of the columns the required alignment (‘l’ for left-justification, ‘r’ for right-justification or ‘c’ for centring.) The second argument consists of an arbitrary number of lines separated by \cr, and with fields separated by \tab. For example:

  \tabular{rlll}{
    [,1] \tab Ozone   \tab numeric \tab Ozone (ppb)\cr
    [,2] \tab Solar.R \tab numeric \tab Solar R (lang)\cr
    [,3] \tab Wind    \tab numeric \tab Wind (mph)\cr
    [,4] \tab Temp    \tab numeric \tab Temperature (degrees F)\cr
    [,5] \tab Month   \tab numeric \tab Month (1--12)\cr
    [,6] \tab Day     \tab numeric \tab Day of month (1--31)
  }

There must be the same number of fields on each line as there are alignments in the first argument, and they must be non-empty (but can contain only spaces). (There is no whitespace between \tabular and the first argument, nor between the two arguments.)


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.5 Cross-references

The markup \link{foo} (usually in the combination \code{\link{foo}}) produces a hyperlink to the help for foo. Here foo is a topic, that is the argument of \alias markup in another Rd file (possibly in another package). Hyperlinks are supported in some of the formats to which Rd files are converted, for example HTML and PDF, but ignored in others, e.g. the text format.

One main usage of \link is in the \seealso section of the help page, see Rd format.

Note that whereas leading and trailing spaces are stripped when extracting a topic from a \alias, they are not stripped when looking up the topic of a \link.

You can specify a link to a different topic than its name by \link[=dest]{name} which links to topic dest with name name. This can be used to refer to the documentation for S3/4 classes, for example \code{"\link[=abc-class]{abc}"} would be a way to refer to the documentation of an S4 class "abc" defined in your package, and \code{"\link[=terms.object]{terms}"} to the S3 "terms" class (in package stats). To make these easy to read in the source file, \code{"\linkS4class{abc}"} expands to the form given above.

There are two other forms of optional argument specified as \link[pkg]{foo} and \link[pkg:bar]{foo} to link to the package pkg, to files foo.html and bar.html respectively. These are rarely needed, perhaps to refer to not-yet-installed packages (but there the HTML help system will resolve the link at run time) or in the normally undesirable event that more than one package offers help on a topic98 (in which case the present package has precedence so this is only needed to refer to other packages). They are currently only used in HTML help (and ignored for hyperlinks in LaTeX conversions of help pages), and link to the file rather than the topic (since there is no way to know which topics are in which files in an uninstalled package). The only reason to use these forms for base and recommended packages is to force a reference to a package that might be further down the search path. Because they have been frequently misused, the HTML help system looks for topic foo in package pkg if it does not find file foo.html.


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.6 Mathematics

Mathematical formulae should be set beautifully for printed documentation yet we still want something useful for text and HTML online help. To this end, the two commands \eqn{latex}{ascii} and \deqn{latex}{ascii} are used. Whereas \eqn is used for “inline” formulae (corresponding to TeX’s $…$), \deqn gives “displayed equations” (as in LaTeX’s displaymath environment, or TeX’s $$…$$). Both arguments are treated as ‘verbatim’ text.

Both commands can also be used as \eqn{latexascii} (only one argument) which then is used for both latex and ascii. No whitespace is allowed between command and the first argument, nor between the first and second arguments.

The following example is from Poisson.Rd:

  \deqn{p(x) = \frac{\lambda^x e^{-\lambda}}{x!}}{%
        p(x) = \lambda^x exp(-\lambda)/x!}
  for \eqn{x = 0, 1, 2, \ldots}.

For text on-line help we get

    p(x) = lambda^x exp(-lambda)/x!

for x = 0, 1, 2, ....

Greek letters (both cases) will be rendered in HTML if preceded by a backslash, \dots and \ldots will be rendered as ellipses and \sqrt, \ge and \le as mathematical symbols.

Note that only basic LaTeX can be used, there being no provision to specify LaTeX style files such as the AMS extensions.


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.7 Figures

To include figures in help pages, use the \figure markup. There are three forms.

The two commonly used simple forms are \figure{filename} and \figure{filename}{alternate text}. This will include a copy of the figure in either HTML or LaTeX output. In text output, the alternate text will be displayed instead. (When the second argument is omitted, the filename will be used.) Both the filename and the alternate text will be parsed verbatim, and should not include special characters that are significant in HTML or LaTeX.

The expert form is \figure{filename}{options: string}. (The word ‘options:’ must be typed exactly as shown and followed by at least one space.) In this form, the string is copied into the HTML img tag as attributes following the src attribute, or into the second argument of the \Figure macro in LaTeX, which by default is used as options to an \includegraphics call. As it is unlikely that any single string would suffice for both display modes, the expert form would normally be wrapped in conditionals. It is up to the author to make sure that legal HTML/LaTeX is used. For example, to include a logo in both HTML (using the simple form) and LaTeX (using the expert form), the following could be used:

\if{html}{\figure{Rlogo.svg}{options: width=100 alt="R logo"}}
\if{latex}{\figure{Rlogo.pdf}{options: width=0.5in}}

The files containing the figures should be stored in the directory man/figures. Files with extensions .jpg, .jpeg, .pdf, .png and .svg from that directory will be copied to the help/figures directory at install time. (Figures in PDF format will not display in most HTML browsers, but might be the best choice in reference manuals.) Specify the filename relative to man/figures in the \figure directive.


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.8 Insertions

Use \R for the R system itself. Use \dots for the dots in function argument lists ‘…’, and \ldots for ellipsis dots in ordinary text.99 These can be followed by {}, and should be unless followed by whitespace.

After an unescaped ‘%’, you can put your own comments regarding the help text. The rest of the line (but not the newline at the end) will be completely disregarded. Therefore, you can also use it to make part of the “help” invisible.

You can produce a backslash (‘\’) by escaping it by another backslash. (Note that \cr is used for generating line breaks.)

The “comment” character ‘%’ and unpaired braces100 almost always need to be escaped by ‘\’, and ‘\\’ can be used for backslash and needs to be when there are two or more adjacent backslashes. In R-like code quoted strings are handled slightly differently; see “Parsing Rd files” for details – in particular braces should not be escaped in quoted strings.

All of ‘% { } \’ should be escaped in LaTeX-like text.

Text which might need to be represented differently in different encodings should be marked by \enc, e.g. \enc{Jöreskog}{Joreskog} (with no whitespace between the braces) where the first argument will be used where encodings are allowed and the second should be ASCII (and is used for e.g. the text conversion in locales that cannot represent the encoded form). (This is intended to be used for individual words, not whole sentences or paragraphs.)


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.9 Indices

The \alias command (see Documenting functions) is used to specify the “topics” documented, which should include all R objects in a package such as functions and variables, data sets, and S4 classes and methods (see Documenting S4 classes and methods). The on-line help system searches the index data base consisting of all alias topics.

In addition, it is possible to provide “concept index entries” using \concept, which can be used for help.search() lookups. E.g., file cor.test.Rd in the standard package stats contains

\concept{Kendall correlation coefficient}
\concept{Pearson correlation coefficient}
\concept{Spearman correlation coefficient}

so that e.g. ??Spearman will succeed in finding the help page for the test for association between paired samples using Spearman’s rho.

(Note that help.search() only uses “sections” of documentation objects with no additional markup.)

If you want to cross reference such items from other help files via \link, you need to use \alias and not \concept.


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.10 Platform-specific documentation

Sometimes the documentation needs to differ by platform. Currently two OS-specific options are available, ‘unix’ and ‘windows’, and lines in the help source file can be enclosed in

#ifdef OS
   ...
#endif

or

#ifndef OS
   ...
#endif

for OS-specific inclusion or exclusion. Such blocks should not be nested, and should be entirely within a block (that is, between the opening and closing brace of a section or item), or at top-level contain one or more complete sections.

If the differences between platforms are extensive or the R objects documented are only relevant to one platform, platform-specific Rd files can be put in a unix or windows subdirectory.


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.11 Conditional text

Occasionally the best content for one output format is different from the best content for another. For this situation, the \if{format}{text} or \ifelse{format}{text}{alternate} markup is used. Here format is a comma separated list of formats in which the text should be rendered. The alternate will be rendered if the format does not match. Both text and alternate may be any sequence of text and markup.

Currently the following formats are recognized: example, html, latex and text. These select output for the corresponding targets. (Note that example refers to extracted example code rather than the displayed example in some other format.) Also accepted are TRUE (matching all formats) and FALSE (matching no formats). These could be the output of the \Sexpr macro (see Dynamic pages).

The \out{literal} macro would usually be used within the text part of \if{format}{text}. It causes the renderer to output the literal text exactly, with no attempt to escape special characters. For example, use the following to output the markup necessary to display the Greek letter α in LaTeX or HTML, and the text string alpha in other formats:

\ifelse{latex}{\out{$\alpha$}}{\ifelse{html}{\out{&alpha;}}{alpha}}

Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.12 Dynamic pages

Two macros supporting dynamically generated man pages are \Sexpr and \RdOpts. These are modelled after Sweave, and are intended to contain executable R expressions in the Rd file.

The main argument to \Sexpr must be valid R code that can be executed. It may also take options in square brackets before the main argument. Depending on the options, the code may be executed at package build time, package install time, or man page rendering time.

The options follow the same format as in Sweave, but a different set of options is supported; see the online document “Parsing Rd files” for the list of allowed options and their defaults.
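
As a sketch (the choice of options here is only one possibility), a line in an Rd file could report the version of R used to render the page with

This page was rendered under \Sexpr[stage=render]{R.version.string}.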

The \RdOpts macro is used to set new defaults for options to apply to following uses of \Sexpr.

For more details, see the online document “Parsing Rd files”.


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.13 User-defined macros

The \newcommand and \renewcommand macros allow new macros to be defined within an Rd file. These are similar but not identical to the same-named LaTeX macros.

They each take two arguments which are parsed verbatim. The first is the name of the new macro including the initial backslash, and the second is the macro definition. As in LaTeX, \newcommand requires that the new macro not have been previously defined, whereas \renewcommand allows existing macros (including all built-in ones) to be replaced. (As from version 3.2.0, this test is disabled by default, but may be enabled by setting the environment variable _WARN_DUPLICATE_RD_MACROS_ to a true value.)

Also as in LaTeX, the new macro may be defined to take arguments, and numeric placeholders such as #1 are used in the macro definition. However, unlike LaTeX, the number of arguments is determined automatically from the highest placeholder number seen in the macro definition. For example, a macro definition containing #1 and #3 (but no other placeholders) will define a three argument macro (whose second argument will be ignored). As in LaTeX, at most 9 arguments may be defined. If the # character is followed by a non-digit it will have no special significance. All arguments to user-defined macros will be parsed as verbatim text, and simple text-substitution will be used to replace the place-holders, after which the replacement text will be parsed.
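
As a small sketch (the macro name and its use are invented for illustration), a two-argument macro could be defined by

\newcommand{\keyterm}{\strong{#1} (see \code{#2})}

after which \keyterm{lazy loading}{lazyLoad} would be replaced by \strong{lazy loading} (see \code{lazyLoad}) before being parsed.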

As of R version 3.2.0, a number of macros are defined in the file share/Rd/macros/system.Rd of the R source or home directory, and these will normally be available in all .Rd files. For example, that file contains the definition

\newcommand{\PR}{\Sexpr[results=rd]{tools:::Rd_expr_PR(#1)}}

which defines \PR to be a single argument macro; then code (typically used in the NEWS.Rd file) like

\PR{1234}

will expand to

\Sexpr[results=rd]{tools:::Rd_expr_PR(1234)}

when parsed.

Some macros that might be of general use are:

\CRANpkg{pkg}

A package on CRAN.

\sspace

A single space (used after a period that does not end a sentence).

\doi{numbers}

A digital object identifier (DOI).

See the system.Rd file in share/Rd/macros for more details and macro definitions, including macros \packageTitle, \packageDescription, \packageAuthor, \packageMaintainer, \packageDESCRIPTION and \packageIndices.

Packages may also define their own common macros; these would be stored in an .Rd file in man/macros in the package source and will be installed into help/macros when the package is installed. A package may also use the macros from a different package by listing the other package in the ‘RdMacros’ field in the DESCRIPTION file.


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.14 Encoding

Rd files are text files and so it is impossible to deduce the encoding they are written in unless ASCII: files with 8-bit characters could be UTF-8, Latin-1, Latin-9, KOI8-R, EUC-JP, etc. So an \encoding{} section must be used to specify the encoding if it is not ASCII. (The \encoding{} section must be on a line by itself, and in particular one containing no non-ASCII characters. The encoding declared in the DESCRIPTION file will be used if none is declared in the file.) The Rd files are converted to UTF-8 before parsing and so the preferred encoding for the files themselves is now UTF-8.

Wherever possible, avoid non-ASCII chars in Rd files, and even symbols such as ‘<’, ‘>’, ‘$’, ‘^’, ‘&’, ‘|’, ‘@’, ‘~’, and ‘*’ outside ‘verbatim’ environments (since they may disappear in fonts designed to render text). (Function showNonASCIIfile in package tools can help in finding non-ASCII bytes in the files.)

For convenience, encoding names ‘latin1’ and ‘latin2’ are always recognized: these and ‘UTF-8’ are likely to work fairly widely. However, this does not mean that all characters in UTF-8 will be recognized, and the coverage of non-Latin characters101 is fairly low. Using LaTeX inputenx (see ?Rd2pdf in R) will give greater coverage of UTF-8.

The \enc command (see Insertions) can be used to provide transliterations which will be used in conversions that do not support the declared encoding.

The LaTeX conversion converts the file to UTF-8 from the declared encoding, and includes a

\inputencoding{utf8}

command, and this needs to be matched by a suitable invocation of the \usepackage{inputenc} command. The R utility R CMD Rd2pdf looks at the converted code and includes the encodings used: it might for example use

\usepackage[utf8]{inputenc}

(Use of utf8 as an encoding requires LaTeX dated 2003/12/01 or later. Also, the use of Cyrillic characters in ‘UTF-8’ appears to also need ‘\usepackage[T2A]{fontenc}’, and R CMD Rd2pdf includes this conditionally on the file t2aenc.def being present and environment variable _R_CYRILLIC_TEX_ being set.)

Note that this mechanism works best with Latin letters: the coverage of UTF-8 in LaTeX is quite low.


Next: , Previous: , Up: Writing R documentation files   [Contents][Index]

2.15 Processing documentation files

There are several commands to process Rd files from the system command line.

Using R CMD Rdconv one can convert R documentation format to other formats, or extract the executable examples for run-time testing. The currently supported conversions are to plain text, HTML and LaTeX as well as extraction of the examples.

R CMD Rd2pdf generates PDF output from documentation in Rd files, which can be specified either explicitly or by the path to a directory with the sources of a package. In the latter case, a reference manual for all documented objects in the package is created, including the information in the DESCRIPTION files.

R CMD Sweave and R CMD Stangle process vignette-like documentation files (e.g. Sweave vignettes with extension ‘.Snw’ or ‘.Rnw’, or other non-Sweave vignettes). R CMD Stangle is used to extract the R code fragments.

The exact usage and a detailed list of available options for all of these commands can be obtained by running R CMD command --help, e.g., R CMD Rdconv --help. All available commands can be listed using R --help (or Rcmd --help under Windows).

All of these work under Windows. You may need to have installed the tools to build packages from source as described in the “R Installation and Administration” manual, although typically all that is needed is a LaTeX installation.


Previous: , Up: Writing R documentation files   [Contents][Index]

2.16 Editing Rd files

It can be very helpful to prepare .Rd files using an editor which knows about their syntax and will highlight commands, indent to show the structure and detect mis-matched braces, and so on.

The system most commonly used for this is some version of Emacs (including XEmacs) with the ESS package (https://ESS.R-project.org/: it is often installed with Emacs but may need to be loaded, or even installed, separately).

Another is the Eclipse IDE with the Stat-ET plugin (http://www.walware.de/goto/statet), and (on Windows only) Tinn-R (http://sourceforge.net/projects/tinn-r/).

People have also used LaTeX mode in an editor, as .Rd files are rather similar to LaTeX files.

Some R front-ends provide editing support for .Rd files, for example RStudio (https://rstudio.org/).


Next: , Previous: , Up: Top   [Contents][Index]

3 Tidying and profiling R code

R code which is worth preserving in a package and perhaps making available for others to use is worth documenting, tidying up and perhaps optimizing. The last two of these activities are the subject of this chapter.


Next: , Previous: , Up: Tidying and profiling R code   [Contents][Index]

3.1 Tidying R code

R treats function code loaded from packages and code entered by users differently. By default code entered by users has the source code stored internally, and when the function is listed, the original source is reproduced. Loading code from a package (by default) discards the source code, and the function listing is re-created from the parse tree of the function.

Normally keeping the source code is a good idea, and in particular it avoids comments being removed from the source. However, we can make use of the ability to re-create a function listing from its parse tree to produce a tidy version of the function, for example with consistent indentation and spaces around operators. If the original source does not follow the standard format this tidied version can be much easier to read.

We can subvert the keeping of source in two ways.

  1. The option keep.source can be set to FALSE before the code is loaded into R.
  2. The stored source code can be removed by calling the removeSource() function, for example by
    myfun <- removeSource(myfun)
    

In each case if we then list the function we will get the standard layout.

Suppose we have a file of functions myfuns.R that we want to tidy up. Create a file tidy.R containing

source("myfuns.R", keep.source = FALSE)
dump(ls(all = TRUE), file = "new.myfuns.R")

and run R with this as the source file, for example by R --vanilla < tidy.R or by pasting into an R session. Then the file new.myfuns.R will contain the functions in alphabetical order in the standard layout. Warning: comments in your functions will be lost.

The standard format provides a good starting point for further tidying. Although the deparsing cannot do so, we recommend the consistent use of the preferred assignment operator ‘<-’ (rather than ‘=’) for assignment. Many package authors use a version of Emacs (on a Unix-alike or Windows) to edit R code, using the ESS[S] mode of the ESS Emacs package. See R coding standards in R Internals for style options within the ESS[S] mode recommended for the source code of R itself.


Next: , Previous: , Up: Tidying and profiling R code   [Contents][Index]

3.2 Profiling R code for speed

It is possible to profile R code on Windows and most102 Unix-alike versions of R.

The command Rprof is used to control profiling, and its help page can be consulted for full details. Profiling works by recording at fixed intervals103 (by default every 20 msecs) which line in which R function is being used, and recording the results in a file (default Rprof.out in the working directory). Then the function summaryRprof or the command-line utility R CMD Rprof Rprof.out can be used to summarize the activity.
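
A minimal sketch of this workflow within R (the file name and the code being profiled are chosen arbitrarily) is

Rprof("prof.out")                            # start the sampling profiler
invisible(replicate(20, sort(runif(1e5))))   # code to be profiled
Rprof(NULL)                                  # stop profiling
summaryRprof("prof.out")$by.self             # summarize, ordered by self time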

As an example, consider the following code (from Venables & Ripley, 2002, pp. 225–6).

library(MASS); library(boot)
storm.fm <- nls(Time ~ b*Viscosity/(Wt - c), stormer,
                start = c(b=30.401, c=2.2183))
st <- cbind(stormer, fit=fitted(storm.fm))
storm.bf <- function(rs, i) {
    st$Time <-  st$fit + rs[i]
    tmp <- nls(Time ~ (b * Viscosity)/(Wt - c), st,
               start = coef(storm.fm))
    tmp$m$getAllPars()
}
rs <- scale(resid(storm.fm), scale = FALSE) # remove the mean
Rprof("boot.out")
storm.boot <- boot(rs, storm.bf, R = 4999) # slow enough to profile
Rprof(NULL)

Having run this we can summarize the results by

R CMD Rprof boot.out

Each sample represents 0.02 seconds.
Total run time: 22.52 seconds.

Total seconds: time spent in function and callees.
Self seconds: time spent in function alone.

   %       total       %        self
 total    seconds     self    seconds    name
 100.0     25.22       0.2      0.04     "boot"
  99.8     25.18       0.6      0.16     "statistic"
  96.3     24.30       4.0      1.02     "nls"
  33.9      8.56       2.2      0.56     "<Anonymous>"
  32.4      8.18       1.4      0.36     "eval"
  31.8      8.02       1.4      0.34     ".Call"
  28.6      7.22       0.0      0.00     "eval.parent"
  28.5      7.18       0.3      0.08     "model.frame"
  28.1      7.10       3.5      0.88     "model.frame.default"
  17.4      4.38       0.7      0.18     "sapply"
  15.0      3.78       3.2      0.80     "nlsModel"
  12.5      3.16       1.8      0.46     "lapply"
  12.3      3.10       2.7      0.68     "assign"
 ...

   %        self        %      total
  self    seconds     total   seconds    name
   5.7      1.44       7.5      1.88     "inherits"
   4.0      1.02      96.3     24.30     "nls"
   3.6      0.92       3.6      0.92     "$"
   3.5      0.88      28.1      7.10     "model.frame.default"
   3.2      0.80      15.0      3.78     "nlsModel"
   2.8      0.70       9.8      2.46     "qr.coef"
   2.7      0.68      12.3      3.10     "assign"
   2.5      0.64       2.5      0.64     ".Fortran"
   2.5      0.62       7.1      1.80     "qr.default"
   2.2      0.56      33.9      8.56     "<Anonymous>"
   2.1      0.54       5.9      1.48     "unlist"
   2.1      0.52       7.9      2.00     "FUN"
  ...

This often produces surprising results and can be used to identify bottlenecks or pieces of R code that could benefit from being replaced by compiled code.

Two warnings: profiling does impose a small performance penalty, and the output files can be very large if long runs are profiled at the default sampling interval.

Profiling short runs can sometimes give misleading results. R from time to time performs garbage collection to reclaim unused memory, and this takes an appreciable amount of time which profiling will charge to whichever function happens to provoke it. It may be useful to compare profiling code immediately after a call to gc() with a profiling run without a preceding call to gc.

More detailed analysis of the output can be achieved by the tools in the CRAN packages proftools and profr: in particular these allow call graphs to be studied.


Next: , Previous: , Up: Tidying and profiling R code   [Contents][Index]

3.3 Profiling R code for memory use

Measuring memory use in R code is useful either when the code takes more memory than is conveniently available or when memory allocation and copying of objects is responsible for slow code. There are three ways to profile memory use over time in R code. All three require R to have been compiled with --enable-memory-profiling, which is not the default, but is currently used for the macOS and Windows binary distributions. All can be misleading, for different reasons.

In understanding the memory profiles it is useful to know a little more about R’s memory allocation. Looking at the results of gc() shows a division of memory into Vcells used to store the contents of vectors and Ncells used to store everything else, including all the administrative overhead for vectors such as type and length information. In fact the vector contents are divided into two pools. Memory for small vectors (by default 128 bytes or less) is obtained in large chunks and then parcelled out by R; memory for larger vectors is obtained directly from the operating system.

Some memory allocation is obvious in interpreted code, for example,

y <- x + 1

allocates memory for a new vector y. Other memory allocation is less obvious and occurs because R is forced to make good on its promise of ‘call-by-value’ argument passing. When an argument is passed to a function it is not immediately copied. Copying occurs (if necessary) only when the argument is modified. This can lead to surprising memory use. For example, in the ‘survey’ package we have

print.svycoxph <- function (x, ...)
{
    print(x$survey.design, varnames = FALSE, design.summaries = FALSE, ...)
    x$call <- x$printcall
    NextMethod()
}

It may not be obvious that the assignment to x$call will cause the entire object x to be copied. This copying to preserve the call-by-value illusion is usually done by the internal C function duplicate.

The main reason that memory-use profiling is difficult is garbage collection. Memory is allocated at well-defined times in an R program, but is freed whenever the garbage collector happens to run.


Next: , Previous: , Up: Profiling R code for memory use   [Contents][Index]

3.3.1 Memory statistics from Rprof

The sampling profiler Rprof described in the previous section can be given the option memory.profiling=TRUE. It then writes out the total R memory allocation in small vectors, large vectors, and cons cells or nodes at each sampling interval. It also writes out the number of calls to the internal function duplicate, which is called to copy R objects. summaryRprof provides summaries of this information. The main reason that this can be misleading is that the memory use is attributed to the function running at the end of the sampling interval. A second reason is that garbage collection can make the amount of memory in use decrease, so a function appears to use little memory. Running under gctorture helps with both problems: it slows down the code to effectively increase the sampling frequency and it makes each garbage collection release a smaller amount of memory. Changing the memory limits with mem.limits() may also be useful, to see how the code would run under different memory conditions.
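
A sketch of such a run (the file name and the profiled code are arbitrary) is

Rprof("mem.out", memory.profiling = TRUE)    # sample memory use as well as time
y <- lapply(1:50, function(i) rnorm(1e5))    # code to be profiled
Rprof(NULL)
summaryRprof("mem.out", memory = "both")     # show memory alongside the timings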


Next: , Previous: , Up: Profiling R code for memory use   [Contents][Index]

3.3.2 Tracking memory allocations

The second method of memory profiling uses a memory-allocation profiler, Rprofmem(), which writes out a stack trace to an output file every time a large vector is allocated (with a user-specified threshold for ‘large’) or a new page of memory is allocated for the R heap. Summary functions for this output are still being designed.

Running the example from the previous section with

> Rprofmem("boot.memprof",threshold=1000)
> storm.boot <- boot(rs, storm.bf, R = 4999)
> Rprofmem(NULL)

shows that apart from some initial and final work in boot there are no vector allocations over 1000 bytes.


Previous: , Up: Profiling R code for memory use   [Contents][Index]

3.3.3 Tracing copies of an object

The third method of memory profiling involves tracing copies made of a specific (presumably large) R object. Calling tracemem on an object marks it so that a message is printed to standard output when the object is copied via duplicate or coercion to another type, or when a new object of the same size is created in arithmetic operations. The main reason that this can be misleading is that copying of subsets or components of an object is not tracked. It may be helpful to use tracemem on these components.

In the example above we can run tracemem on the data frame st

> tracemem(st)
[1] "<0x9abd5e0>"
> storm.boot <- boot(rs, storm.bf, R = 4)
memtrace[0x9abd5e0->0x92a6d08]: statistic boot
memtrace[0x92a6d08->0x92a6d80]: $<-.data.frame $<- statistic boot
memtrace[0x92a6d80->0x92a6df8]: $<-.data.frame $<- statistic boot
memtrace[0x9abd5e0->0x9271318]: statistic boot
memtrace[0x9271318->0x9271390]: $<-.data.frame $<- statistic boot
memtrace[0x9271390->0x9271408]: $<-.data.frame $<- statistic boot
memtrace[0x9abd5e0->0x914f558]: statistic boot
memtrace[0x914f558->0x914f5f8]: $<-.data.frame $<- statistic boot
memtrace[0x914f5f8->0x914f670]: $<-.data.frame $<- statistic boot
memtrace[0x9abd5e0->0x972cbf0]: statistic boot
memtrace[0x972cbf0->0x972cc68]: $<-.data.frame $<- statistic boot
memtrace[0x972cc68->0x972cd08]: $<-.data.frame $<- statistic boot
memtrace[0x9abd5e0->0x98ead98]: statistic boot
memtrace[0x98ead98->0x98eae10]: $<-.data.frame $<- statistic boot
memtrace[0x98eae10->0x98eae88]: $<-.data.frame $<- statistic boot

The object is duplicated fifteen times, three times for each of the R+1 calls to storm.bf. This is surprising, since none of the duplications happen inside nls. Stepping through storm.bf in the debugger shows that all three happen in the line

st$Time <- st$fit + rs[i]

Data frames are slower than matrices and this is an example of why. Using tracemem(st$Viscosity) does not reveal any additional copying.


Previous: , Up: Tidying and profiling R code   [Contents][Index]

3.4 Profiling compiled code

Profiling compiled code is highly system-specific, but this section contains some hints gleaned from various R users. Some methods need to be different for a compiled executable and for dynamic/shared libraries/objects as used by R packages. We know of no good way to profile DLLs on Windows.


Next: , Previous: , Up: Profiling compiled code   [Contents][Index]

3.4.1 Linux

Options include using sprof for a shared object, and oprofile (see http://oprofile.sourceforge.net/) and perf (see https://perf.wiki.kernel.org/index.php/Tutorial) for any executable or shared object.

3.4.1.1 sprof

You can select shared objects to be profiled with sprof by setting the environment variable LD_PROFILE. For example

% setenv LD_PROFILE /path/to/R_HOME/library/stats/libs/stats.so
R
... run the boot example
% sprof /path/to/R_HOME/library/stats/libs/stats.so \
  /var/tmp/path/to/R_HOME/library/stats/libs/stats.so.profile

Flat profile:

Each sample counts as 0.01 seconds.
  %   cumulative   self              self     total
 time   seconds   seconds    calls  us/call  us/call  name
 76.19      0.32     0.32        0     0.00           numeric_deriv
 16.67      0.39     0.07        0     0.00           nls_iter
  7.14      0.42     0.03        0     0.00           getListElement

rm /var/tmp/path/to/R_HOME/library/stats/libs/stats.so.profile
... to clean up ...

It is possible that root access is needed to create the directories used for the profile data.

3.4.1.2 oprofile and operf

The oprofile project has two modes of operation. In what is now called ‘legacy’ mode, it uses a daemon to collect information on a process (see below). Since version 0.9.8 (August 2012), the preferred mode is to use operf, so we discuss that first. The modes differ in how the profiling data is collected: it is analysed by tools such as opreport and opannotate in both.

Here is an example on x86_64 Linux using R 3.0.2. File pvec.R contains the part of the examples from pvec in package parallel:

library(parallel)
N <- 1e6
dates <- sprintf('%04d-%02d-%02d', as.integer(2000+rnorm(N)),
                 as.integer(runif(N, 1, 12)), as.integer(runif(N, 1, 28)))
system.time(a <- as.POSIXct(dates, format = "%Y-%m-%d"))

with timings from the final step

   user  system elapsed
  0.371   0.237   0.612

R-level profiling by Rprof shows

                     self.time self.pct total.time total.pct
"strptime"                1.70    41.06       1.70     41.06
"as.POSIXct.POSIXlt"      1.40    33.82       1.42     34.30
"sprintf"                 0.74    17.87       0.98     23.67
...

so the conversion from character to POSIXlt takes most of the time.

This can be run under operf and analysed by

operf R -f pvec.R
opreport
opreport -l /path/to/R_HOME/bin/exec/R
opannotate --source /path/to/R_HOME/bin/exec/R
## And for the system time
opreport -l /lib64/libc.so.6

The first report shows where (which library etc) the time was spent:

CPU_CLK_UNHALT...|
  samples|      %|
------------------
   166761 99.9161 Rdev
        CPU_CLK_UNHALT...|
          samples|      %|
        ------------------
            70586 42.3276 no-vmlinux
            56963 34.1585 libc-2.16.so
            36922 22.1407 R
             1584  0.9499 stats.so
              624  0.3742 libm-2.16.so
...

The rest of the output is voluminous, and only extracts are shown below.

Most of the time within R is spent in

samples  %        image name symbol name
10397    28.5123  R           R_gc_internal
5683     15.5848  R           do_sprintf
3036      8.3258  R           do_asPOSIXct
2427      6.6557  R           do_strptime
2421      6.6392  R           Rf_mkCharLenCE
1480      4.0587  R           w_strptime_internal
1202      3.2963  R           Rf_qnorm5
1165      3.1948  R           unif_rand
675       1.8511  R           mktime0
617       1.6920  R           makelt
617       1.6920  R           validate_tm
584       1.6015  R           day_of_the_week
...

opannotate shows that 31% of the time in R is spent in memory.c, 21% in datetime.c and 7% in Rstrptime.h. The analysis for libc showed that calls to wcsftime dominated, so those calls were cached for R 3.0.3: the time spent in no-vmlinux (the kernel) was reduced dramatically.

On platforms which support it, call graphs can be produced by opcontrol --callgraph if collected via operf --callgraph.

The profiling data is by default stored in sub-directory oprofile_data of the current directory, which can be removed at the end of the session.

Another example, from sm version 2.2-5.4. The example for sm.variogram took a long time:

system.time(example(sm.variogram))
...
   user  system elapsed
  5.543   3.202   8.785

including a lot of system time. Profiling just the slow part, the second plot, showed

  samples|      %|
------------------
   381845 99.9885 R
        CPU_CLK_UNHALT...|
          samples|      %|
        ------------------
           187484 49.0995 sm.so
           169627 44.4230 no-vmlinux
            12636  3.3092 libgfortran.so.3.0.0
             6455  1.6905 R

so the system time was almost all in the Linux kernel. It is possible to dig deeper if you have a matching uncompressed kernel with debug symbols to specify via --vmlinux: we did not.

In ‘legacy’ mode oprofile works by running a daemon which collects information. The daemon must be started as root, e.g.

% su
% opcontrol --no-vmlinux
% (optional, some platforms) opcontrol --callgraph=5
% opcontrol --start
% exit

Then as a user

% R
... run the boot example
% opcontrol --dump
% opreport -l /path/to/R_HOME/library/stats/libs/stats.so
...
samples  %        symbol name
1623     75.5939  anonymous symbol from section .plt
349      16.2552  numeric_deriv
113       5.2632  nls_iter
62        2.8878  getListElement
% opreport -l /path/to/R_HOME/bin/exec/R
...
samples  %        symbol name
76052    11.9912  Rf_eval
54670     8.6198  Rf_findVarInFrame3
37814     5.9622  Rf_allocVector
31489     4.9649  Rf_duplicate
28221     4.4496  Rf_protect
26485     4.1759  Rf_cons
23650     3.7289  Rf_matchArgs
21088     3.3250  Rf_findFun
19995     3.1526  findVarLocInFrame
14871     2.3447  Rf_evalList
13794     2.1749  R_Newhashpjw
13522     2.1320  R_gc_internal
...

Shutting down the profiler and clearing the records needs to be done as root.


Next: , Previous: , Up: Profiling compiled code   [Contents][Index]

3.4.2 Solaris

On 64-bit (only) Solaris, the standard profiling tool gprof collects information from shared objects compiled with -pg.


Previous: , Up: Profiling compiled code   [Contents][Index]

3.4.3 macOS

Developers have recommended sample (or Sampler.app, which is a GUI version), Shark (in version of Xcode up to those for Snow Leopard), and Instruments (part of Xcode, see https://developer.apple.com/library/content/documentation/DeveloperTools/Conceptual/InstrumentsUserGuide/index.html).


Next: , Previous: , Up: Top   [Contents][Index]

4 Debugging

This chapter covers the debugging of R extensions, starting with the ways to get useful error information and moving on to how to deal with errors that crash R. For those who prefer other styles there are contributed packages such as debug on CRAN (described in an article in R-News 3/3). (There are notes from 2002 provided by Roger Peng at http://www.biostat.jhsph.edu/~rpeng/docs/R-debug-tools.pdf which provide complementary examples to those given here.)


Next: , Previous: , Up: Debugging   [Contents][Index]

4.1 Browsing

Most of the R-level debugging facilities are based around the built-in browser. This can be used directly by inserting a call to browser() into the code of a function (for example, using fix(my_function) ). When code execution reaches that point in the function, control returns to the R console with a special prompt. For example

> fix(summary.data.frame) ## insert browser() call after for() loop
> summary(women)
Called from: summary.data.frame(women)
Browse[1]> ls()
 [1] "digits" "i"      "lbs"    "lw"     "maxsum" "nm"     "nr"     "nv"
 [9] "object" "sms"    "z"
Browse[1]> maxsum
[1] 7
Browse[1]>
     height         weight
 Min.   :58.0   Min.   :115.0
 1st Qu.:61.5   1st Qu.:124.5
 Median :65.0   Median :135.0
 Mean   :65.0   Mean   :136.7
 3rd Qu.:68.5   3rd Qu.:148.0
 Max.   :72.0   Max.   :164.0
> rm(summary.data.frame)

At the browser prompt one can enter any R expression, so for example ls() lists the objects in the current frame, and entering the name of an object will104 print it. The following commands are also accepted

Errors in code executed at the browser prompt will normally return control to the browser prompt. Objects can be altered by assignment, and will keep their changed values when the browser is exited. Ift rally obtained ,om 2002 prn3gng desntrol toturn pck tings fro the browser prtorslude using &laph


Next: Checking memory accese, Previous: Browsing, Up: Debugging   [Contents][Index]

Debugging R cing

bll de>ls() laion is binonal, suo; packput sing bll de>ls()

Here is an exaings fromregt expre s

suoot ac)
st &l(13,12,11,14,14,11,13,11,12 4999)
fThpurec)
st &l(0,0,0,0,0,0,0,2,2 4999)
s is))
st <- cbuoot a,
fThpure 4999)
e">dino ve)
st &l(0, 5^(0:7) 4999)
glm(s is)~ 0+e">dino v,
fTmila =ndowom sp(prok=;    &qog%m-%d&quo

:e are le">Table eis diomponhaion es fed a: pon relhichlyions, starnged v ple">> bll de 3ad98opofmem(& are le">Table eis diomponhaion es fed a: pon relhichly3.0.0 ons, starnged vmprof&qs the.scale = F herglm.fit(xscaX, a =nY, wmari wm, sta inwta stawta m(N)), emon staemon ineffe">T=neffe">, fTmila =nfTmilaction, con=tion, com(N)), uage so an=te is(m>, &quge so a%m-%d&q ple">0) 1: glm(s is)~ 0 + e">dino v, fTmila =ndowom sp(prok =; &qog%m-%d&quo ...

Data frameshose ghe maer cacl un u(ons, starting s to tnerakesbelowe>, sess g at ful oat a meswine notes e is Rf_dit c"3"> work in ls()bll de>ls() sr ifding alltion (pard to the funcr of ctory, which ca>mem.esnorsn by setmode Q-pg bll , and ;- _dupln hning at ful owaiodet colleppen i ging compiled code, for ex (ples from nls
>> bll de
herquot;(R_632  nls, mctitrl201ack )
1:
lt;-y)~ a + e ~ x,      staode>(asta0.12345, e =     321 201ack t= ling)  ...

and b store relode out tnerakess the i calls to <.Cr.appx$call

> bll de
9:
gm(acif, x)
8erquot;(R_652  numeric_,any Rlaristacirho, use)
7:2552  nuDric_tionm[[3]],>

(asta0.12345, e =     321 )  ...
Oc r (optlyUsing bll de>ls() doesy beemory, and hich catore re odeS4third medisp Rf_es iling inexitCopen ind for The follo For example

xyde)
st &newofmem(&xylocmprof&qsx=eger(r20), y=eger(r20)uo

ribeds. the enviro(pkg) :e ari">Sy now calprof&suo; pa:S4nswuot;nv&qode> onsear (whiesa

ribesome inize(nged FALSE, :eSreign langhird me can sfuncgoc a isful elues cen) Calledo the intedisp Rf_efloop or fun ‘some inize’ ple">> bll de 2:esome inize(nged FALSE, 1: lewofmem(&xylocmprof&qsx = eger(r20), y = eger(r20)) ...

ory, wode> doesy beaemch 12)syles tn it eding call to Qls()for ws. ewshows wfro ful owaioying tipassi(

aere ilaimis usuopaqu. The exshere.)

ls()ls()
coolles( ful  = ol -nData fe4999)
glm(s is)~ 0 + e">dino v,
fTmila =ndowom sp(prok =;    &qog%m-%d&quo

:e are le">Table eis diomponhaion es fed a: pon relhichlyions, starnged v ...

ory, wjec12), the aion andnaram,e of an obsquo Cample
< in

 the deb(e
Mt a me:  

:e are le">Table eis diomponhaion es fed a: pon relhichlyions, starnged v Atly avail the enviroonhadcr of : 1: glm(s is)~ 0 + e">dino v, fTmila =ndowom sp(prok = ; &qog%m-%d&quoherglm.fit(xscaX, a =nY, wmari wm, sta inwta stawta maemo 3ad98opofmem(& are le">Table eis diomponhaion es fed a: pon relhichlyions, starnged v Ecan entg the environ the n,code0 codex obgw an sfun: ...

ory, whoseer12), the sng fre"Table of casUsing bll icateowaer ,above wefor s anone ibeode (dwtailathes ae hree h running atstem part of ful .bgw an sfTidyig the environor happro the broireason Data elowe>, seso cold to the funcr offorms whpawnloct o ful oat a me,c on,s Rfto snal, part of-and-varscolty,code ex into the funcr of ) time.

Broribed in the environ
ale r
of:
0.0glm.fit(xscaX, a =nY,      wmari     wm,      sta     inwta omen)
Called f the deb.lon.(;-  omen)
Browse[1]> ls()
 [1] &aicrintf"   fmem(&quue styrintf&quorintf&e eioldt;sms"    &ion, cot;sms"    &ionuot;nv&quo6)
 [1] &qevmprof"   fmem(&qev.s iiddigits" &qevoldt;sms"quot; &EMPTYuot;i"      &wtaot;nv&q[1()
 [1] &wta     rintf&quorintf&fTmilat;sms"quot; &fitmprof"   fmem(&f nomprof"       &quge so a%m-%d&q[16)
 [1] & nlsmprof"       &prokinvt;sms"    &murintf"    ;    &mu.wtaot;nv&;    ;    &mu.wta.nge%m-%d&q[2()
 [1] &emon   t;sms"    &nrintf"     ;    &nf noouot;lbs&quofmem(&
auot;lbs"quofmem(&
-ans%m-%d&q[26)
 [1] &effe">ot;nv&;    ;    &on   t;sms"  ;    &e    wtaot;nv&;  ;    &e    murintf"    &e nd-re"%m-%d&q[3()
 [1] &e nmurintf" rintf&wrintf"     ;    &     wmrintf"    &xrintf"     ;    &x

dino v,
fTmila =ndowom sp(prok = ;    &qog%m-%d&quoherglm.fit(xscaX, a =nY,      wmari     wm,      sta     inwta     stawta     
3ad98opofmem(&
are    le">Table eis diomponhaion es fed a: pon relhichlyions, starn

Ecan entg the environ the n,code0 codex obgw an sfun: :164.0
>cast.ol -=1 + 1
Be tusm mple It is poss se gfl o aRf_eusnew pagR. W to ds isrcionrrir c# And forol -needs tsavllec and hich cao be eio an lude udown tng co-next flag via
coolles( ful  = profe({ol -nData f(to.e priling); q()}))  ...
Sss g aty beaunction furo anocoolles line oturned The exsdata.

ls()
coolles( ful  = havevbre4999)
glm(s is)~ 0 + e">dino v,
fTmila =ndowom sp(prok = ;    &qog%m-%d&quo

:e are le">Table eis diomponhaion es fed a: pon relhichlyions, starnged v Ecan en Data n the n,code0 codex o 1: glm(s is)~ 0 + e">dino v, fTmila =ndowom sp(prok = ; &qog%m-%d&quoherglm.fit(xscaX, a =nY, wmari wm, sta inwta stawta w an sfun: ...

ory, wtputle esimilaracter to owaer ,above wes anone ris, tht, part ofan R prtused dir,n alere aco ->Tidyingre-lomisleathatol -nt Aofilsty bea new says,cmple and Qee wesiodkection n b
). If no ode>f;- _ movue boo direthes iron grtinlos a lde> t athes aerhree h runjmon mufto store ful nalty,aof no good wao hninmode is to use derof. T ## isrting a ca when the br uted at edebnebuggint in the functons, starinonquo;step-thsquo; mowe>penree, for ex wsamplldis t>< in

 the (glm.fite4999)
glm(s is)~ 0 + e">dino v,
fTmila =ndowom sp(prok =;    &qog%m-%d&quoevel debugpnerglm.fit(xscaX, a =nY,      wmari     wm,      sta     inwta     stawta     m(N)),emon   staemon   ineffe">T=neffe">,
fTmila =nfTmilaction, con=tion, com(N)),uage so an=te is(m>,
     &quge so a%m-%d&q ple">0)
evel : {xec/ode> listswh con
the funmen)

Browse[1]evel : xime(a <than x(xgram))uot;
Browse[1]     
(st)-2.235357e-06]evel : wtaime(a &dropox %*%]      omen)
Browse[1]wtalinux
        inux
        2inux
        3inux
        4inux
        5
l    0000e+00 -2.235357e-06 -1.117679e-05 -    8393e-05 -2.794197e-04linux
       6inux
        7inux
        8inux
        9
-1.397098e-03 -6.985492e-03 -3.492746e-02 -1.746373e-01quo)

Browse[1]evel : muime(a &prokinv(wtaime(a &wtai+neffe"> omen)
Browse[1]&ulinux
    inux
    2inux
    3inux
    4inux
    5inux
    6inux
    7inux
    8
1   000001584444478158444488815844444411584447206 584486039 584430389 584656755linux
   9
0.8397616q ...
(>

Thr pr use n to ano the func hninmodilselfstill beihe dedbsets p> contaeding call to browser()derofhiddes fary functlty,S3 Some metby ext,o use (libs/:::e">dino.Arimae>ls()S4 Some menaram,e on be ncumulap>Ifose gode> ony bea new fodes to derof de of a funcdef;-lleppen i n to ano the fun, ext,od to the funcmple ls() bes to derof onre erdof a functows wmple )onll step-thsteduntilo out tnerdof a funchaion es oef;-llation.n cct.es to derof on tnerdof a funcolty,s to use c c, Prela fose gcter to c l And for>The Most of f the secode>untilo outof a funcwserdm.esnorggo anwisrs % g red)sdata.

derof n the functode, for ex toe) ## inaeding call to browser()
> (glm.fit,n the brnalt=22)
Trellsptof a funcfmem(&flm.fit%m-%d& mnynuo; pac;    &on ts%m-%d&q[()
 [1] &flm.fit%m-%d&4999)
glm(s is)~ 0 + e">dino v,
fTmila =ndowom sp(prok =;    &qog%m-%d&quoTrellsptglm.fit(xscaX, a =nY,      wmari     wm,      sta     ilinuwta     stawta     maALSE. onll 22omen)
Called fenge(ny Rla the la tnlos omen)
Browse[1]nxec/lty, ebule-onll  ledoiven ase[1]untgt; (glm.fituot;))
cderof (,rdm.iling)ode>c).


Checking memory accing

Errorking memlis alormation IdChe/writ_ movueen i nrrthe amestle ginmuncr t is ablehat rscoext,,l, ghe detfornonal, maa maatrompf">C t ofthat ction asook a call a> on te lking memory a: mnynu an ular da ica whenal/Icnd etforms wRdilselfchaiomlis alompmood and bave e tioae curuted at the garb meswilan sfunc(de, se gutelanlstgarb me wilan sfuns call aed ,om 2e we n es oeleted)sdata.

Nopln hninking memory anwith ermoodprns es odle LAPACK, BLAS, OpenMP)Java-lude uuted pac:onal, uteleas ppart osrns emneeds tis">Co (optio/a>)oal, uror>Tlalloctouutsude ufrom charode iuot;.Fo.ere.)

cc and ls() and tolof oe mis Rf_bscoperhapsrting ing meleaktforam,efer t> and , ghe de.ere.)


es.ce iy beafuldet co ming mean varms time ed ,om 2nd uir >orks by ru garb meswilan sfuncaioyf">Ccaiot is posashe>. Ts ca mevroduced by c (descruncwlsty bea newere.) se wA so sme s b cify e mm>via< thelactsror tunmmedirompt. div>

Te pfre"Tedenter &lming mean tn sfunrough&rsmode imis"Bro Ce top-lr of caled by cc][Garb me Cilan sfuns 3/31ory, wif mis"Bro mliswme ed ,om 2eeds tgarb me-> if collelues wheyracts tandcworn b< Bam, obce on so y beaodle efer oking-me>Tlallocwith esdata.

Noll norma by run run under c and jmon n be prnaedhat nd uir >ibed in fan R pr, hopbe uormnlos calls to thualcr t i. Se ag at the from secode>h to how ciper otges shat rssdata.

It is possio, us ding alltary exa, tee> lanlectgnetplsfvevbrroduced by ode>c run under corslude ud inroolle /em> -pg.

the funcmple Tln reeportccTln reeport fme in ionfigurompiled with T -andetysMostee> l And for, as.ritysMoststswrmes bnrri"hcoe (en-varexiten addi func ee> lto y beadet co an tn s insu framesen-vare:nt in

    AingGCframese uogGCf./div> New noprseinonl no nopra newframese skstartes to Afan en GC no leo noprse hninhere arsMostypees to , Preftypeeectsthe rrex div> Mkess the lto mory aloop or funs c"3">, aniroon), lanl on), . ddR ex Moststsnopralty, antoldftypeeames

    iscribed insful at a men div>

d by sat iignop>rof bee is ted by crafer o hnncmple corsn by se the environ-and-varcmple <_R_CHECK_GCT_N_ode>cUsompiled af the debuent, ale mple ).


Iper if you hory anto the Lunca ginmuncCPUftypeeory gp opresn (in ves Mos.4.3 mwill104r ifThiss ted by chttp:/eftghisdojectccese Andt is possan varmsc
 saleak-c"3">=e
uo%m-%d& savaomllae)
st mypkg-Ex.R
 ...
orws wm in i">Tabltary exa, ext,od to ileecrealscrib m in odataode>cTh derable ng cor roolmizatpressionThis cawoprh c"3">Che> run ue ilaroolmizll iong co:amplemaxin norror informas toa thedernel with Debugebug sykages. Wetg as and b snal, nl no ing meleakt ples from < Idnextderof up to the end ofR s expreernExp cold mode i us dsed ar20x< ther o hnnc alere ed by ccCRANc some plat> ows wmple cc <,> ows wm-an< top-od-an< i c0, 1eory2atiLtop-l0 ecttt. the dertionwde> does dde an hrof. Ltop-l1> and det co nal, t ismwill
104parunsome inischa ing memionhaiolittartimpactornonp ed (iong ascrto top-ld=ssLtop-l2> and det co m andefer oking-mes tep">smwill104oram, oR. mch< ther olues a by run run ed by colty,cse g ther )sdata.

A is an exaablg by c

nuo; pacibe2006nt it.  The fnext
rr _duplsfevon Rnhaioan  needer Ida4tlitrseotes ea ing memddR ex evon
itwode> doesyyou hory anto.he>.  Ts  The faltered C call 01ack tode>
se
hows wfroisful eying t s. 

sode> de> (dp func end of ing methes ae hory a s.

Iinen i n bis kamlis alompling by colty,didinop) evt cold toan varmribed iunso#l/I/docesn thedepagR.mode. In For exad at l 01ack t aerenp-thsway)solalralty,fixed at ug 1ory, ww2)spenting cls() ). that l 01ack tject issuis diompire) or infowe l toroolle /em> c. e hst it-mopremlevel dn u(lin the derd by ][). ul>d by c -or so#=y fspataiss C autoinfocn-and-varsclty, al,mwill104

It is possio, us ding alltary exa, tee> lanlectgnetplsfvevbrroduced by ode>c run under c c <~//eftghisdrcspata co se mple

=e
uo
sasode>-or  so#=y f
 ...
ory eby setd in the environ-and-varcmple c .4.3 nr ifmoods isrcioensd e hninwith Debugebug sygamese pro tly avai (steg by cc <.sospata and mayturnnhows wuted packherewo#ten) Caibe> nrebsremlm I cbuoh 2)sylec/em> e f.mode#ten)de uuted pacrting d by c. ea funchaiode> (descrd inusm pars to $callc (describewlstdent/docasfun: g by c

g by c efconrnel nimilaraaim calls t meng mec"3">r >ibed by cI tly avainrnel nuir> 104pard by and anl o as s ablC++fcuo;roe (denelode outC++flibrbtyphaion es nter &lic maalomrough&r:nning atstem parwrit_ mo In sppl needesing Qs to ale mple coer-evbrfe fspata<10p>104 via cC 1&ndat ;2GBnalt, us dia elo> 64-bmdesome plat>i doe alis alo) 16&ndat ;20TBsMoshe hualcmeng m:r>Therictosegsy blsn by se Thi tusm an varms). iv>By iong aisunc ale mple c). iv>Reciron-(in ves n nd reebug socnmddR exss# And foris alormapart of ful hich prCample c<11p>104In ode> onpath:lodei

I tly avainficst i ode> onpatheoryhaion es rell<12p>104

on be ncumulap>Itouuipeed inre e ocstep-th l to c<13p>104lty,perhaps c

tion ling compC++fcuo;) mple ls() .4.3 ,nr ifmoods is io, us g by c

ory, w and ensd e hnintoremple c fromR code eaposash>owaer c and h"3"> This caen-vareLunca per-nuo; pacbdsisoorslude ua /em> <~//R/M -ansspata
(Nopln hninmple
<-fsaome z"=mddR exspe>cIn be ode>prokingnt it.srns by
se  
and  ma
 cah ns  altereuted pacrtry, wignos wm in
<~//R/M   -ansspata
to 
rok d ofathstem librbtiescode> fromR code eaposlodei
 ).
iv>ck. coolles ltly avaincify viacvia c).
iv>ck. ode (dwtailscribor informarmatie ful naRc002 prnrun> run ue
evel dn uiled afbreakthat pi">Tmufto storemddR ex saome z"hs
% oprcws
n be prd: fodes to < in

c<14p>104 s o be orsASANlectttninmple
ls() and < ew[]/oelete[]ode>cls().  infharodrnonal, ebsremscficst is
of:tsror tunmmedi
rnonal, linin seehows wiobode> doesinfhar,rebsremllibrbtiesmwill<15p>104here arscopenstompation.
h"3"> This ca gp R exslterero

ill bnter &l/em> c


/a>, Previous: , Up:   [Contents][Index]

Fodes to ls()c. Ts ctly avainos reciron-(in ves pard by and
 contnal, ming met
up to the end ofan a pr,lodeR
ilselfctime thet agaimstiASao, dingrun  
and yyou hn
ly d e sful eon tue
here (wmoodro

is a by runRedsep sMos thed runReilselfl). ul>eoo isavaint Che>lty, al, herictoC++ c"3">Che> t>< in

 th ASAN_OPTIONS ‘mlis
_w tois
_mis  Rf_=0:det co_leakt=0:det co_odr_, Pcument=0’

...
LSaocn so yaiom,nter &l the -al berough&requo;rhows wiobject mg comribelude dem>

nter &lUnoef;-ll bee w Prrrough&rsmodhows wfroieignunew sthe stabode> doesrequirl nu an ular bee w Prrpings frome ng corernExry exadro

is p> in ve ry z"hocherws wfodedouvarscRCRANCRAN &60559 bee w PrrpficsC/C++fo dsot)cts t>ablz"ho-lenged arrths,rehif> ltoon lrwfode egiedftypescoext,ys to lanl mis-ft" vmompatiNoeduncinmunctary exaTablout-of-rir c#coercpre time uted packherenfhamp> lto coercenans to c r egiedftypetages suaming ls()cc). ul>nter &lUBSaome z"hrough&rsmod> efconfodeC/C++fsPrrcwn the eso coompli em> <16p>104pard by wa, wuted parough&rsnDLLssioniobjecl ex Mo">Ccs is noao c

iscribed by c). ul>e>. aome z"hsThis cacinb;-ll odle tnotAddR ex Saome z"hpli em> ). ul>Fcoer ion, conefathes nd h"3">eer his caca mevroduc pfer ocoolles:ampled by <17p>104. the curi">Tfodes to < in

-evbrfe f,
float-p>
ide-by-z"ho,
the funcp as.r -p>
ide-by-z"ho,nonnull-at rifice,
null,ed ,om-o z",that "h-evbrfe f,n nd r,n nd rs-nonnull-at rifice,ehif>,
ingied-p as.r -evbrfe f,unrwa, -varn r egied-p as.r -evbrfe f,vla-quue ,vpthrame)
(plucttt. mto sn to sicy-(in ves ming -b reeport-exp beotderofc
ide-by-z"horame)
Ooolles g by < the funofile and tt. maim R code eaposluc thed runfroieifharernel nal, hrofiodkeample

Ooolle g by < loat-p>
ide-by-z"hoderofs to
ale R
ory, waliswmages p>
in vesedsep   sMos/RANCRAN &60559
andth).
ul>w e
h as r ied (in ve sioGCC,ewo#ten)
Caodecify via
h as nter &lPn R
pr
I #l/I/document Ooollesrough&r)e And forcoolles ngp opresnuc GCC:36 ode>7 ngp opr>< in

ide-by-z"ho,
nonnull-at rifice,null,ed ,om-o z",n nd r,n nd rs-nonnull-at rifice,
ehif>,ingied-p as.r -evbrfe f,unrwa, -varnvla-quue ,vpthrame)
plucttt. mto sn to sicy-(in ves ming -b reeport-exp beotderoflong the dercoolles>< in

orws wm by < loat-p>
ide-by-z"hoderofR,s t  onl
on), Cn ve sioon), c).
ul>Fngs GCC 8 ming c
and noook a"hplee> dehe de
p   sMos/em>   -pg.
is>< in

ory, wr t is tt. The f % oprcao cafatRf (iis owthe is# And fo on), and run ue evel dn uiled afbreakthat pi">Tmufto storesaome z"hs % oprcws n be prd: fodes to < in

oan  neitoturned odle
l to ).


c). ul>d by ode> onsPrrcw iles durebugling casfun: ess /a>][

nter &lD .bMing mrough&r ples f at http:/droing-mkject/sup>In ea ing mec"3">r And( the cuus),32-bmdeWrelowsna the Llty,a4.3 nrnel nimilaraaim caled by c<18p>104
Cheesing"> ChemacOS-1">

Che ccOS

Mkess end ofuot;.Fo e ng cor>In be ale Rwaliswm the ao cet mg com ale c"3">Che>parnrrth quue s:tode, for ex l to spata . ompt gowe a isful elues whecuppr ocr ther oquue cwserxc is nntext, mple

 ‘1’>pardtemCn ve 1>parnrrth ‘x’>abovecuppr oquue cpar0rame)
Onebode> d isrcio e awa e hninlazyfan R prmarodMo">Cc to syfuot;.Fo dtemCn ves utes to <1ode>crafer o hnncmple <*derof

Ieasyde inrrtn ica s to and h"3"> ve jmon frome drciber ie uted pa:s ddeedesem> <~//R/M -ansspatation l to

FFLAGS = sg -O2 -mtune=ncumul sfquue s-c"3">
 ...
orwber i, us g by ode>c). ul>e>. oood % oprcwnTors cpirewith erodle tnotgood hninuot;.Fo erom char -and-varscas wutexsl,ynu an ularireorwbeuot;.Fo assroutmaatcas wsquo C liom Cgcuo;e we>. inodro

is d inusm parBLASclty,LAPACK assroutmaatcim Rssioniobjecdoes dvisavainto, thedeR ilselfcting quue s c"3">Che olty,aoodsmaycse g not is poss2)syle to ssroutmaatcas wsquo C durebug fo R, thedl).


/a>, Previous: Checking memory accese, Up: Debugging   [Contents][Index]

Debugt mg comriledccing

onr ocr tanlstan R prmarod and b sfacll odle tnotd isrcioevel t mg comriledookarrecode> Rashe>. co ve . fe ascrto some plat lude us to c). ul>Conen ir The fnter &lchat rsrough&rna hninmoelues R nlsmsoalomounexp colllyernel aocwuo galcmeng memory an(m,nter &l ghe derough&r de>nter &lbuanwith rough&r),cwuo gal ) #l/Icn ve ode emilarssiUnix-ft"ke>-(in ves parRas toa egint hodeler1ory, waim callgowe oal, bdsicorror informde>Fde, for example

Sng, sodon fan a promoodprndam padco h. and Ilnd saforcoolles as tt. The fAnd firexit(Nopln hnina clrorol ->In odly n be prdahows en-vare:ca ginmunc the derinca y blsecttoslemit irsUsizl tos0,cst. (ry isava ro itshere.) In eanuo; pacory, wt is g by <.Cderofw=aling)ode>c(luy, wr 2 prneso coompcify via d in the environ-and-varcmple cheirgis tt. goo g by <.Cderof and <.Fot;.Foderof iost ofe ng comriledo rmesi ibed in64tlitrsenineifer ocns>parnntargInstrr/a>). ul>Anefer or t i parafnter &lchat rough&rsmode ievbr us doutCr l nt Roan calesode>a hninmbewlsting iled cram,idemoodhreeen,ino Irdap tyet mg com cuo;e wck. od irn POSIXggingliaronOSrscR tt. tope top-ln bmptssionselsgetso can hrofiodkeample

per if you h shat  luy, whoseera clrorol ->r ifThiss to can hrofiodkeample


n m and  resai

It is possio,at lh af the debue> nra by runan a pr: t at ecty bee usiarnnta be ncumulaplent-ee

Iinnusm prd walinestig to a taskl hninseemioao cata>Chec lrwtoonok aowser. is o be ors can hrof odkeample


t>< in Cding all the deburnod ofR code eaposctode, for ex orsR -d gdb. a>li>
  • S R. a>li>
  • Atsodon fan mptsss to use .okarderofcli>
  • See clteintbr upon eginte we>. ompt e ocr ifbde>a og all the deb n bmpto a>li>
  • Setsodonnreakthat iber iencuo;e a>li>
  • Cel=inur tade eorm parRaorsdypde uskbd> eginte0< cen>

    RET. a>li>Urun uWrelows egints ooodsmay noavainto, can benalty,if so fromon a dd e is ode (gingli alom.bgw ewfroiaw-FAQ).


    /a>, Previous: Debugt mg comriledccesg   [Contents][Index]

    DebuggiOS cory, wt is tnotdoll nmR prat rof meheirismioao prat sodon fed ,omwthat isrcio ywm-ancory, wompt odly n at snter &led ,om rough&rsamp>. from the ao ce the deex ata.

    Anefer ogoodnd wading mple


  • /iv>Aory ane iep aubngrebsremlothe funs isUsify via and c

    .dwtailn ompt differpereuome pla (s ewfroiuncnext y be)nalty,about ding aatctade eorm (smayally obtily lina y bl) lty, anuassont,argInstrdedesn), cory, wiostruecompt coll cold nure e ocMost ofe nmasd ibe> as R erom charshllocVsamp>. nr y bl: Wrelows jecdoesPOSIXggingliaronlty, an (TIn easep ame the fun g by < y blode>c). ul>ehm the funesn), In odly ctly avainoseUnix-ft"kes, lty,aoodsmayb (reld-varc an (e a>). up>

    ul>ehm to wocothe funs hich prolteintbrfaclcto cong comriledo hninyaiobeen prokedcode> R,neifer onin thedestem edecify viao use .okarderof][.okarlmne py>.unokardees/e we>eycas wurimbtilyeintbndbe ode t mg comrClmne FORTRAN 77riledo esp coevedi,gb ocst. g by <.Cderof][). ul>ehm The fargInstrde> wa, w the funeIn eaerom charsstrie u to syebug fo bug sp> <19p>104tolC ed FORTRANna hninmoe hm the fun ode ssroutmaaillc as R ode>fed ,omwwues wheccong comriledon nd rssited odonargInstrsUas whosenp> fed ,omw(b ocsmaya s be to frome ng com cuo;)e a>). ul>ehm The febug le="Thoseer anumcppde ubetwees whecod is parRaatomic hllocVsclty, anutypescparnrgInstrsUe> nrC the fun odeFORTRAN ssroutmaae a>). ublock en

    /0"> R stornew muo;cc Cutypecc FORTRAN typecc /td>cc d by ::c ::c ::c ::c ::c :: Do soeasotdotentt. The f wositObed in64-bmdeUnix/ the /a4.3 nsome plat, d by and c(luy, wt is g by and nototurnLuncand 64-bmdesome plat (allevugh,idemoo reeelrwtooturnLunc can,dro

    ide uWrelows)atiNoeren so frninmfer ie e ng comriledond a mixnd e enCcothe funs mne FORTRAN ssan R prsnd o argInstrutypescmmon Rf_ aswhosenpibed in/p> ). ul>Cutype k in and Tfodefrome ng corshere.) e ed ,om eThis caa s be edesn), <.Cderof). ul>I

    It is possio,utexcnInsricy-(locVscpar tornew muo; k in cbng C odeFORTRANncuo;e a>). ul>Logicptenged vearey strusues to <0ode>c(s to codes to c(s to ceerfodemisude unged ves to CRAN and <-Infderofeex ata.

    ArgInstrds to s tos to <; &baso; &derof R)ernelsts tond highorm tsirapos,s2)syle (TIn no good waavoidf wo uted pao rmesr>In de ud ins r t i Rde> rhat xit(edaide. smayaR edoc onl anccablsectongs frombodyeparaf the fun oef;-ll ibeeanuo; pac> Nopln hnin anccong comriledoshplld,smayn nd renny hrofirxc pin arvugh irsUasgInstrs:nCcothe funs shplld,bls endype k in FORTRAN ssan R prsnshplld,bls ssroutmaate a>). ul>eoofix id an, letauxs anen ir an-(iyonim ex For ex the (wTplvolseer wo fsomedosequ ccls.t(Ttat ectyardcwaydo fbe>oiteintbraR resnR iled cram easydim Cncuo;e) WefTplldidost at lude us to <.Cderof< in

    <20p>104tol  Rf_auee> d wfstrie usify via
    h to re g re a istrrywibed ink in owaer ,awues erom charshllocVsoas wn be  ter o hnncibee
    r Id-odly wthnast. g by <.Cablode>cPasude uerom charsstrie sU waFORTRAN iledos isiocse gode (gas ,code
    shplld,blsavoidrdahows It is possitOblyntt.  The felenstrU end o
    erom charshllocVbjecutexsl in,cod  lfixed-lenged (255)uerom charsnrrth.
    Upcede255uerom charveareya s befbde>a ogaalenged-selscrom charshllocVsa>ownwcing
    a.  ournsr(de>cse gifeitoturn lninabl) depee s rnod orClmne
    FORTRAN ilng cor>Iun ea, wuome pla (ro

    ide urnod oir coolles/e wOo">C whninmoebe runa s be waFORTRAN In odi parafsl nol#">Tpart is possnged v (asfacocVbjs R nlsms) the (wTplld,n be ncumulire noa s be od n sots.r iled:e emilarireFORTRAN iledo hninwatrsUe> gen aueediagnoe>bc my obewseThisutexcateintb.r iledUe> nrC o >e wrappr oory, wompt anserm,idee> nrerom charsstrie e a>). ul>I

    It is possio,utexcoal, e ed ,om e ter o hnncatomicshllocVso In odly 104 opresnlodeyi toricptet mgaubpiniem: s t st. g by <.Cablode>ccisio/em> cc). up>


    ul>Cong comriledo os can be ale Rbjeclkarrecod lsh ascred ,om (Unix-ft"kes>ro

    ide ua4.3 , ess /a>][odeDLL (Wrelows)a a>). ul>ehm sh ascred ,om/DLL jeclkarrecling by .okarderofunokarrecli and .unokarderofro

    ide Wrelowse a>). ul>ehm The fargInstrde> boed othe funs isUeaerom charsstrie ugivebumd o pathfe> t ofed ,om. Pn R prmarodshplld,smayassul, ls to sicy ile el=cnn ve odecst. ed ,om/DLL (ages suesata

    /tr> If lsh ascred ,om/DLL jeclkarrecode ( hnncog, sodonmkessrecint>-(in ve at lurexmwill<21p>104iMorb gen anir,aiost ofs ibesaer al sh ascred ,oma,sodonmkessrecintlyookarrecocethe cce at lurex we>eds to reg srrnforma(s ewfroirel= ass="se) hich prof no goosd waavoidf andambiguiemTin>ory, wocethe cce is oeatrr/a>). ul>ObiUnix-ft"kes l tooathsan be al resolse py> Tbyntt. /em> viao ata). ul>Wrelows ohe fd ode (gin, conolty,l ex neng iem)neer corws wdepee str DLLveareylookesnlodsitObiabls-(in ves stis>I

    isiot>eds to cisot>eddis cpg meliom luy, wd ofDLL waeclkarrexbgI

    It is possio mdd c ebuleooathfting qumedohighdprioriemUsify viac.okarderof cisot>ednuo; parough&rsngata).


    Brsnter &lncumulrough&r>routmaa,nwcaoeat a istrrywthat pibecong comrileda a>). ul>Inssquoxcedesn), <.Cderofcc <22p>104okarrecDLLveaty, ancR code eapos de,librbti veim jeclrokedctositA be ncumulir,sodonauthorc end ofDLL Thi exgli iuus reg sreraroutmaat ale Rbode>u toa ebule, uome pla-ame=pee stt ieheirismsode,frr _ a t ofroutmaaseibed inDLLsitObe Thiss tostis>reg srrnformaieheirismsao prch prolddi fun noibor inform aboutcaaroutmaa,dro

    ide ueroirthe nsode>dypac end ofnrgInstrs,code c so oionltly avaine> Repn R prmarod run ue eiffereth naermwill<23p>104good w fins>d ofnddR ex end ofwtrrywthat hify viaettnintt. wtrry that pisssquo C odle doutrighto the ns ennrgInstrsUanenasp fun nir,sodo rightoargInstrutypessere.)

    eooreg sreraroutmaat ale RssselscFuoxce orClroutmaa h to ][.okarlmne py>.unokardeesssih to 5nnrgInstrsx we>ed The fp>Ithrcmple c t ofjsme iniznformaroutmaa.ier. is orws wR storis tno ibor informaaboutcodonmethodsx we>edre ibebu 4nargInstrsUas wnrrths prscrbpibuewhecroutmaat odecwa, weend of4 eiffereth p asrfacls: g by <.Cderofcc :: c:: :: c:: Cthe cuusnast. g by dypacp>Ithrcs dypacenterel=aimec Teldt odecst. > ory, witcThis canss" aberincRssa that "hae> t ofactual ncumula bug sp(i.e.,st ofroutmaaeptself)nalty, anc the ns ennrgInstrs t ofroutmaaeexp co eao cep s be ongs R.gFde, for ex, if weLyydee routmaae> c
    alogn odle anyeefer oroutmaat odecst. g to <.Cablode>cviaccthe ns e
    nrgInstrsUluy, wd"0" wR smaye> rhr">ettofactual >the nsp s besere.)
     and <.Fot;.Foderofowaer ,aif s to siom,nwa, wshplld,blsanrnrrth
    odle douts
    edtypescnrrtheshplld,rel=aim douts to dypas
    prscrbpibuewhecexp coee>dypac end ofnrgInstr.t(Tehendcptly, t ofelenstrs
     end odtypescnrrthear   endype k in 
    c:
    
    :
    
    :
    
    :
    
    :
    
    ::
    Canen ir anCaroutmaa,dg to c
    
    Noerettnin and <.Fot;.Foderofosrepdnd  w
    actualus reg srerad im 
    ale R.  Wefdost at byssquode 
    h to eddescrbp funs
    abovecodecst. routmaat nss" aberbyntt. / to <.Cderof and <.Cablode>cc<:>< in
    

    cWues R>unokard  lsh ascred ,om/DLL,pits>reg srrnformscas wremovrexamp> (TIn no efer ofacbniemTodecsnreg srerebu aa bug
    ssere.)
    

    Efor ext enreg srerebu routmaaseThis cafuue cwnot>eddiffereth nuo; pa wnot>edR sourcl tleo (ext,, hal/ong>on tsccal/ong>dentesal/ong>grapry,sccal/ong>)sitA so, an (TIn eabrief,ohighe top-lin, cducforma.ncsifyR Newsm>via<(vol ][https://www.r-aic ,om.org/doc/Rnews/Rnews_2001-3.pdfcces)e a>). ul>Og, sroutmaat nrenreg srerom,nt>eycThis ca ednuo; parough&rs gata][on tsccal/ong>dnuo; pachl) kple

    Cc end od r iample

    .h to bnuasara be ode, trrywthat scs to siomrbyssrom charsstrie sUsdesn), <.Cderofreg sreromr bug sv:ast. g by ory, ws to sy, trrywthat scbyse ed ,om ages suesrofc(lty, may yssrom charsstrie s)rtiEa, which prs)oal, hictss="se agaimon hcci sttalus frr _ a youe, trrywthat scwues peo ex s04 lyUeaerom charsstrie u aleoutcaanuo; pa, lty,avoidd se febu down,ages asara bsesiRoutmaa h to lata][). ul>Insode (dwtail, if eanuo; pach to Ireg sreromrod l0-lrgInstr g by <.Cablode>ccednuo; pa)>< in

    Debu,nt>eycThiss t
    g by <.Cabl(mypkg:::C_reg)c).
    
    


    ).
    up>
    
    

    <24p>104ablwhnins isioao be o be odecaasl nolnuo; pax a>). uul. uli>iFins>d ofdexevatrc trrywthat s. er. is oal,whninOS-s to sic,yram,oal, hrofit"ke> aon The febugshplld,bl t is pos nintt. OSnrenmasd-lmaa

    ewnot>edR rthe ttnintt.yUas 
     quo
    C byntt. nuo; paclty,how:ewnot>isssqssot>eyoas wn be by
     and <.Cablode>c).
    ul>A be ncumulir,sass=maa t>ednuo; parough&rsdR rthe odecablsg by <.Code>ccc).
    u/li>uli>iCanel/Icn>d ofdeg srrnformadapossitFThe flrmedoskexetve reg srrnform
    cuo;,s ansen fun nirewno ileesem> nintt. wtdc end o
    ogly C sourcl  ileeibed innuo; pa:gife
    
    

    iscribeeaC++f ileed o nter &lsem> lata the fun wplldid isrtocbe oe

    lty, annsre g re douts to cthe nss e nrgInstrssitYou ompt n isrtocmdd deeseribeeaass="hr ile ccVbyou cplldirheate be ae cwn

    is ita.ncsata

    eo
    svearenltly avaine> el=m ch dec<: s ewfroihelpf ileefor
    ge>c

    cto svde0l/ong>. H.rtowcacplldiyyvson be < in

    c).
    u/li>uli>i(Op fun nyram,highormrerenmbndbee)  Restric es to <.Cablode>c  ran=aim
    
    

    c

    cto svde0l/ong>. ier. ompt sp fun nirirheate deedR rthesamp>. ). uul. uli>iEdiin an gata rheate e ed ,om efdecst. reg srerom bug sv:

    to
    h in
    

    Chr">ettnintt. (TIn no s to c). u/li>uli>iRestric es to <.Cablode>c ran=aim

    viaertoas w wo goos
    waydo soLunc cannUnix-ft"kes.  WefThishn i rr _vidual  trrywthat s
    oify via
    h in
    

    A be ncumulir,sw Thischtrgied indehe de vis pbniemTodecablsCfsbug svaby ro

    ide h in

    ro

    ide lude uFot;.Fo rthe lty,goosd wa estric vis pbniemTun Wrelows. a>). u/li>uli>iW. wtdcup ale an ileesem>

    /tr> u/ul. up>
    /a>, Previous:   [Contents][Index]

    lkar.hccata< in

     lataibed in/em> isccataisooathsaodecClmne C++rileda
    a>).
    ul>Itdmmon e to symwill<25p>104 latadecnter &lsem> latadf  aossonuo; pa ,efdecst.yLyyvsotl b.
    pkarremwill<26p>104priorrtocd it one
    (nog foooathftocd iirecong comriledchl) beenoreg srerom)e
    a>).
    div>CRANisooa>][coxmeccal/ong>dees
    lrokingde ioa>][bdsmnfrixccal/ong>deesgandsoa>][xtsccal/ong>/sup>orokingde 
    /a>][zooodal/ong>/supx
    a>).
    up>
    
    

    Sh ascred ,omaefdecokarebueode> R This carheated lude us to c <.o>latadecsourclsaodecC, C++, FORTRAN 77, Fde;.Fo 9x, Od ,ommulaC>decOd ,ommulaC++r( aletel=cnn ves gatad o,Rfhelp fodes to c). ul>Ifecong cebug focsourcl ilesew e smaeturn>&ldr &loutcoen foobox&rdgh&r,ayouacat spto sy,lddi fun noflagvabynsettbng so

    < to c <-I>lata lagv),>

    < to ccc< to cc and cOd ,ommulaC++ cong cohs, reep,ommully)rincd in Tleesem> Semilarir,s-and-varo and c <-l>lataente&ler &lsem> <-L>lata lagveao cep s be tocd inlrokercwues builide ueroish ascred ,om. (S04 lyrofit"okercrenmasdt a enrgInstrsUed h to c ). u/p> Itdp>It is possiosarrtrgi wali

    isocong comriledcliom ofer oeignunews bynsettbng eroimRAN lataibe Tleesem> ). ul>Flagveanninas wabr IdyL#">T(ode, for exlataThis caoer rn st byntt. the enviro -and-varo and cc< in

    
    
    /tr>
    /tr> eoos toodle R,e hm enly hrofiweLyyvsotaydo . orimebueeawrnpprr the fun lty,ensurebug farot>ed the fun . stclo begp >< in

    /tr> Cong cebugasdclmakebugshplld,bl o be odle doutC++rilng coh-t"oker (rafer o hnncst. Cwilng coh-t"oker decst. t"oker itself)& ofer wis ,e hm C++rjsme iniznformarthe (lty,hstcssihl ilnel/Icndec end ofon tic -and-varo and cu t>< in

    TRrough&rs own,CrI/O (ode, for ex).
    ul>MdessRrhss="hr ilet This ca
    
    

    iscrodleibeC++rhicgramvyram,st.y shplld,h0l/ong>notccal/ong>d ca

    iscrodleibenncg by is sbsrem hss="hsmwill<27p>104edp

    n ve parsbsrem hss="hs ibeC++r"htrgidoincR 3.3.0mwill<28p>104 if youacas wabout earlibr>-(in ves parR plsssedchr">eyoue,nuo; pacd irte a>). ul>Legacyeass="hr/em> ). uiv>

    ib

    ide ding aossos04 libe bynnuo; pacoa>][BHodal/ong>/sup, lty,mdess enArmarehe a es04 libe bynnuo; pacoa>][RcppArmarehe odal/ong>/sup) lty,nogisssong comrwannsanrR pted pao hy, wt is ide. ains uo C. er. squs sf ew aicposmt. a>). ul>A,sl nol>the ncoenef=c int librbti v n begp gRonuo; pa yyvsoawC++ p asrfacl iosarlibrbtyTable ng comriled, ext,ep o; pa /a>][rgdalccal/ong>dees andsoa>][rjagnodal/ong>/supxsier. rais sfmanyemde (aicposmt! we>edC++rjsasrfacl us sf> <29p>104<30p>104Ccue ocInstred)rtiEfor ext ro

    is s toparg by viac aow wo ABIs ctly avain decC++11gp gscby 5.xgp gso). ul>Ese g ewernef=c int librbti v n bdC++rjsasrn niriram,prosentcaaC p asrfacl, ages suesa>][rgeonodal/ong>/supxsieresedrequmre doutC++rrun tem librbtyTal b. lrokedeode> hecpted parough&rs sh ascred ,om/DLL, ade> a. is beel o be by>ib

    ide d dummyaC++f ileeibet>ednuo; pagsourclsx a>). ul>T t (TIn earecint>l/ene toclrok tocd inC++rjsasrfacls eferomrbynC softwar ages suesal/ong>hdf5odal/ong>,nh0l/ong>nrheccal/ong>dentesal/ong>ImnewMagickde0l/ong>. ieriireC p asrfaclsUar ges n c<)rtiAlso,cd inC++rjsasrfacls nren e">Cc p fun nyp g allsoftwar buili oe,nuo; pad sep afilyUeny,nogn br ains uode fiom nuo; pa sourclsanrenfar l ex t"kelyae> alr IdyLyyvsot>emains uo C. a>).


    ccs topa Fde;.Fo I/Oflipt tanns ng comrun Wrelows p asrfirtowing CnI/O:swues whe Fde;.Fo I/Ofis jsme inizbe (dypdcptlyowues whecpuo; pagnd lkarre)nd ofC ge>cclrof lata). up>

    .okaralty,dy>.u>lkarccese ). ul>Noerettnind irterqis catricky -(in veebueossued t.rt,cod nuo; pa h0l/ong>nuo;Bccal/ong>dcplld,bl rriabs uo C efreranuo; paso0l/ong>nuo;Accal/ong>d—,idend deeiravainetnintt. API prch pre bynnuo; paco0l/ong>nuo;Bccal/ong>dre ibs buo;wardswrangat posr a>). ul>Shippde d on ticclibrbtyTp gnuo; paso0l/ong>nuo;Bccal/ong>dfodeofer onuo; pa to lrok tocavoidsemdess end oodiffn ultilsx a>). u/p>

    &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][Wrelowscces:op"> <"> & &op"> <"> olign="left" volign="alp">/op"> utr>

    Itdp>It is possioslrok adsh ascred ,omTp gnuo; paso0l/ong>nuo;Accal/ong>ddoca librbtyTprch pre bynnuo; paco0l/ong>nuo;Bccal/ong>durun ulimmeomriircumstlts" ve l Unix-ft"ke OSx we>ertoas wsaer e oprapbniemTossued, nog f. is sma rerenmbndbe odecaad srrificed p o; pax a>). ul>T . is easieel ienh0l/ong>nuo;Bccal/ong>dprch prs d on ticclibrbty ataede>ccnuo;Bccal/ong>dis jscor opated wues nuo; paso0l/ong>nuo;Accal/ong>dit ros uo C,sw enly n isrtocfins>d ofon ticclibrbtyTatains uoo tem for nuo; paso0l/ong>nuo;Accal/ong>x we>edenly ossuedis tocfins>nuo; paco0l/ong>nuo;Bccal/ong>, onl odecstam,tecThisask e bynsal, hrofit"ke>(lognclmaat brokema de dnssomy t.rt)>< in

    c

    cnuo;Bccal/ong>rough&rs ata<31p>104uasara cptthsitYou Thisarrtrgi w pam,ided irte10l/ong>beode ccal/ong>dRpisslaf="hre bynsettbng (unc can some plat)ts to c cnuo;Bccal/ong>dompt notdbso"htrgid nor whecpuo; pagupdnfisrtoeanrhtrgid API)sitObisbsrems odle douts to ccCRANisssqn,bl o be by ext,>< in

     <-Rdisccata a.  is ess"ptbe by
    d inrang c b asdwcingad  hm lioker)a
    a>).
    ul>Itdaood not is possiosfigurooout,wtninjecrequmred semi-automnfdcptly
    liom d odres de pagscby c(loon>fde
    nter &lsem> c).
    ul>M  ebughss="hs prch pre bynnuo; paco0l/ong>nuo;Bccal/ong>datly avaine> d inraedc w
    b.   ng comrin,nuo; paso0l/ong>nuo;Accal/ong>dsqn,bl o be by douts to le>c][).
    dip>
    
    
    /a>, Previous: , Up:   [Contents][Index]

    < in iM ebughss="hs prch pre bynnuo; paco0l/ong>nuo;Bccal/ong>datly avaine> d inraedc w b. ng comrin,nuo; paso0l/ong>nuo;Accal/ong>. ul>T . is o be by douts to le>c][). u/li>uli>ihieorrde us to cT . s isioenietrrywp gsem>
    < in
    

    Tone,nuth,lmne nogode,orokingde 
     wo odemde (nuo; pa  youalplldid isrtocr" oprrios#">tebu wthe enviro
    -and-varo and ).
    u/li>u/ul.
    
    up>
    
    

    div>Usebu CTiled al sp isrupnd ofwlecurfun enanrR the fun . e">Ccer y fruitful. Tradi fun nly hrs hl) beenoo be sify viale>cccedsyntax odecst. rquode the fun .ncRrin, a, wcssedrs similarce> hae pagscby <.C>le>cc us ,eram,s to <.El=c intcce>c

    <>). ul>A,cablstoch to <.Cablode>c scby <.C>le>c
    Tof  the funs lty,mRAN<32p>104oef;-be
    p gsem> cc).
    ul>Beode syouadecidacca s toh to <.Cablode>ccae pfer oa be ncumulsx wFThe ,TilnendeVfturnebueoddp asr..
    tbe R
    e>c<; if  a.  is fsst envugh, tr.  is nr innirle>cle>cCcuude ujmon s to <.C>le>cccccl/eatdnrgInstrsUyouareciivsot>rvugh,t>ese
    jsasrfacls necreyd-enlyx
    a>).
    ul>To yyntlecRfed ,omaeliom odleibeCle drewess tod o,mRAN<33p>104uaubasss end osedrs oef;-berincd inhss="hr ile
     ataisccata).
    ul>A,subatatrialsamounss enR,>ro

    ide d ofon ndars>nuo; pat,>rs ir exviroesrlude udout the funs lty,mRAN). ul>Itd. s s" abtyTal knfwooal, hrofiaboutrhowcRfed ,omaelrtoyyntleegp Cle drrtiAlld hecRfed ,omaeyoualmpt dealswing ompt beiyyntleegwing foo ypefoifySEXPm>viamwill<34p>104le>cviaeddwtailtolrtogivnnslateV incd rs see fun lty,wno/a>][eme ier.nk rafer ena mddal ages sued atdus C bynVisualsBasic,>ro hy, wRfed ,omaelrt yyntbe aruue cwnoCarthe (ls,st.y,oreeoddp asr.. tbe Rarthe)gad hm and-ro dype,gade> aownpon priaueenurt . sl=ractiC, de,yode, for ex, nInsrdcpt cptcuonforms, nly tues itd. s ed CsitAseoddp asr.. tbe R e>c<, ges usedrs mad"Table ercfun al odecssihl and-ro od ,omTe> hm rightodypex a>). u/p>

    &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][Allf_a fng spg newcces:op"> <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][C

    <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][tebun-and-varsecess" access8">Finide elty,n">tebu -and-varscces:op"> <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][So <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> utr>

    a/p>

    div>WeLd isrtocknfwoarl tx" eaboutrd ofwoodR,yyntles mbmg mealRf_a fun. ere mbmg mealRf_a be odece ed ,om ei smaefre C byn foos tr;sins ss=, foombmg meiaeliom tem e> tem oifygarbnew collss=edm>viae ieratdis,c can decablspartfocalRf_a be mbmg menotdbsebu n begpsefre C decmarnad as re-usaposr a>). ul>ThecRfed ,omsdypeswnrenreprosent C bynd Cfsn/Icnuri oef;-berbynd dypedehes to le>c le>cle>ca that "hatoeans to le>c). ul>Ifeyouarheate anrR ed ,omTp gyoue,Criled, youammon tcingRnetninyou nrenlude udouted ,omTby lude udouts to le>ccew ed ,omT. ah3 ignbe toch-an). ul>Prctss=de eltrR ed ,omTautomnfdcptly prctss=scalld hece ed ,om ohat "e tocincd inrarreepordde us to le>c). ul>Tfoooicgramm"ha. nolelyareepors pain dechPreekeepde udout Fuox>e> s to le>cca enrgInstrenncg by cthe ncoened ,om eto unprctss=owues whey nrensm logneVfs ed CsitTfoooictee fun ieheirisms. onack-based,esce> s to le>ccc). ul>H. (TIn ansl nol for exwablereatbng ltrR nInsrdcUercpg cwnoCarthe:>< in

    erR ed ,omTcplldit is poyngetdre oved
    durebug fos   anmpuonforms,cod imT. ajmon oue,Criled,statcjecrunmde .
    Ad imThapprms,cw  Thisdo 
    aleoutctfoooictee fun incd rs  for ex,gram,in
    gen  ancw  do smayknfwo(ndecwatrcca knfw),wtninjechidde ubehins>d ofR
    mRAN).
    ul>In,soawnren easitunformt wt. (Tarllrge
    >the ncoened ,om eae (gen  a=iCaitTfooohat "haoictee fun onack hl) a
    fixed size (dehe de 10,000)dentesqn,blcocc).
    ul>Prctss=dun .  smaed isbe odeced ,om eory, wR alr IdyLknfws,oreeod
    us
    esiIn nu an ular,ad rs d4 lib  to  the funenrgInstrsx
    a>).
    ul>T t (TIn eal ex-n begiRANcle>cedohat "haoictee fun onack.ier.  .  rarelyLs iseeLuuts pr
     foooar brd(tforR sourclsacthe culyLyyvsot>[ lata
    <>).
    ul>Socle>c
    

    Fde, for ex (oiom d odjsasrn nrrthe fdecs to < p fc< in

    Noerettninitcjecdtrgirouseto mmxes to c). u/p>

    T t (TIn enpfer owoodtocavoidrtfocaff,om Tablgarbnew collss="se: anrdin toch to le>cce> cong comriled:conot>edpfer ohlty,wt,blco a. e">Ccrequmresd he s toparaefininizbb)x wIdend l ex effn iiro d ltctfoonr innfnictee fun ieheirism,lmne nhplld,bl n be sorrde lyx a>). up>

    div>Fde,manyepur os ssitcjecsuffn iiro docalRf_a bwRfed ,omaelnd anmpuonfeg fos x we>ertoas wqumedoaL ew 1e>cc ). u/p> One,statcjecrenmuney>u tegpse1e>ccparR-ltop- 1e>cle>cle>cle>certasd hecR the funs llwoos jsme inizbctfooeexviros end ofercpg , 1e>cc). ul>If spg newnjecrequmred decCfed ,omaedurebug fe cptcuonforms tr. is beel allf_a fng by rquode ch to c][Mbmg mealRf_a fun/supx Ablspartfose mbmg mealRf_a dun routmaat do tt.i own,error-chr">de ,es< foooicgramm"haaoodas umb d ltctfoyTompt rais denierror lty,notdrenurn iend ofmbmg me an may xollRf_a bex a>). up>

    div>Users end of ataerR dypeswnrenknfwndjsasrn nire iersodiffereth R datd dypeswnre reprosent C wnoCabynoifySEXPTYPEm>viae iSo). ublock en

    /trass=n u/"> s to <"> nInsrdcUodle spg newnmddach to u/"> s to <"> at eneVmp"> u/"> s to <"> congl">In"> u/"> s to <"> logdcptIn"> u/"> s to <"> chtractirIn"> u/"> s to <"> l srr(gen dcUercpg )In"> u/"> s to c <"> nuirl srIn"> u/"> s to c <"> a nter &lsem> <&heuod &opata u/"> s to <"> NULLIn"> u/"> s to <"> n u/"> s to <"> the funeor the funeclosuremp"> u/"> s to <"> wthe enviroop"> utr> Amoe t o,im opratrcodasrn nrs to e>ccccN.B.ccal/ong>:oa bhvughridend p is possiosrenurn ed ,om e arodasrn nrdypes,sitcjecunsafe to do socas as ump funs lr ad"Taboutrhowcwhey nrenyyntleegwry, waood no, Pla be at s tr-ltop- valua fun.) Mde (dwtailtolrtogivnnswno/a>][). ul>Unl ex youalrtoer y sur eaboutrd of ypefpartfocargInstrs,cd inrade shplld,chr">ed indatd dypese iSoedatd dypeswoened ,om erheateerbyn valua fng ltrR wxnr" afungp st.dCle drrtiYou Thiss to the funs t"ke>s by cccde esiSss d inhss="hr ile atale>c). ul>Wtninhlpprms iend ofs to le>cc hmorarreomsdypeaitFdec for ex,gifeyouafins>d am,an s to le>ccc< in

    le>c<35p>104).
    ul>Alld hece ercfun  the funs do tt.i  own,error-chr">de ,enndsgen  a=i
    s to e>c).
    ul>Noerettnind isece ercfun  the funs oreesvianosm>viac do tt.
    e ercfun incd inrquode cRriled.><>).
    ul>Sonfar weLyyvsoonly #"nnsh f to rheate and e erc
    oRfed ,omaeliom C
    e>c<, lty,h f to sl=ract dout>th  dcUdatd liom >th  dcURUercpg s.
    eres  Thissuffn lTe>     os  ealoe  wayeoddp asrfacbng Rned ,om eto
    nInsrdcpt algordlems,aram,wedaoodd isrtocknfwoarl tx" emde ( o rheate
    n bfulsrenurn ed ,om x
    a>).
    up>
    
    

    ul>M nyeR ed ,om eyyvsoatrrifices: no
     aowh to c
    codle douts to <>
    cparercpg s.
    a>).
    ul>To mptual/ate  his, lessn  lrmed iled,s>     od ooeu "haoicdIcnc endwo
    ercpg s (wry, ws to < uteVmpe>ccc< in
    

    ios#">> aowh to 
    c< in
    

    the funs getdlty,n">cindividualsatrrificese iersi assone orgInstreisUa s to le>c ). ul>T t (Tas wshortcutd t.rt>ioo:d hecfthe funs s to <> cc c cc c ao (Tas wfthe funs ages su s to c c). ul>Wtninhlpprms ienw latrcca ade oncatrrifice statcjecsmaye) -oef;-be?iW d isrtocade ouabug sefdecit sify via

    < by le>cle>c< in

    
    
    
    

    ul>In,R st.d
    c< in
    

    
    
    
    

    div>Soun iom lude >LISP-t"ke>ors=so(ndwncablete&ldgh&rnuirl srs&rdgh&r)oto S-t"ke>gen dcUercpg s. Ad afres de, hm npon priaueeteel for lt ed ,omTablmddach to c< by ctecd isr1e>ccvia<1e>cc). ul>L sroeexviros Tqis carenrievdC decn">cbyndis cpcess" aTe> hmo exviros par hmogen dcUercpg sitS04 ossoweLyyvsoarl srrod ,om < in

    cccle>c< in
    

    
    
    
    

    der"Rnrhtractiriercpg s oreespg edceaT1e>c< to cccccc). ul>s to ccedo to ccle>cc ccccc). u/p>

    You Thisob=aimraeo to cccewn be aty ndy,wt,e> hmorpche beode srenurnde uirded youe iierso and-ro h to cctmpt ensure#null-rerminatorm. a>). ul>NoerettninRnrhtractiristrde swnrenrestrdcrisrtocs to <2^31 - 1cce>chstcssso ehplld>l ododputrtocs to c). up>


    tebun-and-varsening"> tebun-and-varswsing-1">

    tebu -and-varscch4

    tebun-and-varsening"> ul>Itdompt beiusualsdtatcalld hece ed ,om ed isbe rn oue,Crilmputnforms nrenp s be a enrgInstrsstoch to <.Cablode>ccd ofvalu s enRned ,om eliom odleibest.dClgivnn tt.i > cc c). u/p>

    cs by

    ccc<36p>104uh to ccccewnbinide eodle doutspecifibervalu (is "heateeribed ofglobpt wthe enviroe ier. rarreepordsstoch to

    c). up>

    &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> utr>
    &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][Missde elty,npecintovalu scces:op"> <"> & &op"> <"> olign="left" volign="alp">/op"> utr>

    Next:   [Contents][Index]

    
    
    

    ul>Lessn ansert uhr inmed ilnvolu fun wxor ex sa s tos to <.Cablode>c< in

    < in
    

    
    

    Next:   [Contents][Index]

    
    
    

    div>We Thiss to hecs cccc

    &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> utr>

    Next: , Previous: , Up:   [Contents][Index]

    
    
    

    ul>In,d iscass="secw re-lurn> aowexor ex ablBeck"r,yChahe new&or ; Wilke (1988, pp.~205–10) o finide el zeros ena uni-and-too the fune iers Rriled ade>an wxor ex as a>). u/p>
    eme(a-  the fun(x) {
         e  xrme(a- f(x)
         e  if(!is.nInsrdc(x))espgp(;    &N isra nInsrdcoreeult;    &)
         e  as.dou>
    <(x)
        }
        .Cabl(;    &zero;    &,,body(f.chr">),gas.dou>
    <(gu" aes),gas.dou>
    <(eoo),
         e    new.wth())r}
    >rtbe1rme(a-  the fun(x) (x^2 + 1) * (xe- 1.5)
    zero(rtbe1, T(0, 5))rame)
    
    oao (Td isc tem wc do d ooraercfun ade>error-chr">de njnot>edR ilede ere Cle drnrs a>). u/p>
    tirisafe ttnn sorry;    &
        SEXPwsbug
    s, -alu &
        PROTECT(#bug
    se= ats  uo(;    &x;    &))&
        PROTECT(valu (=wmkans(x))&
        oef;-bVan(sbug
    s, -alu , rho);
        UNPROTECT(2)&
        renurn(REAL(eval(f,wrho))[0])&
    }
    ame)
    
    
    < x0 = REAL(gu" aes)[0], x1 = REAL(gu" aes)[1],
         e     eoo = REAL(st
    s)[0];
        dou>
    < f0, f1, fc, xc;
    ame)
    
    
    
    oao (T1e>ccc
    h to c).
    ul>Fde treccong comr-(in ve d oorquo liom R,ompt be
    < in
    

    odle wxor ex usage < in

    Noerettecd isrtoc e> aowexnr" afun tocntop,idebsebu valua edeincd i rquoer. a>). ul>H. (TIn treccongl ls Cle drnory, wwc ompt exnlainoses="secby ses="sex a>). u/p>

    &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> utr>
    /a>, Previous: c   [Contents][Index]
    
    
    

    T t eourcec

    <>). u/p>
    ses to cfune iers
    s to ce>cccccc).
    u/p>
    
    
    

    Next: , Previous: c   [Contents][Index]

    
    
    

    weak div>T t s to e>cch to c ). u/p> El=c intethat "hf< by e>c][RODBCteat eng>ccesgfdecwxs="exe iersyTas wunusualfjnot>ei dTppyfng eemantics jn ttninwaonchn e ed ,omerscclpibe, he el=c intethat "hfed ,omerscnot du lic- ed. (Fdecd jecrtasunesl=c intethat "haeshplld,only bewn be su purt enanedd ,omeodle norinnfeemantics,gfdecwxs="exnaneatrrificea sad eexviro enl l srx) a>). ul>Anoex=c intethat "hfrsccheated by>< in

    rs rqnsmaeporr>
    h to ch to cc hech to cch to ccoftonchrw.><>).
    ul>Aisal=c intive woodas liom R,3.4.0rtoccheateganoex=c intethat "hfliom
    ar the funethat "hfrs>< in
    

    &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> utr>
    /a>, Previous: weak , Up: weak   [Contents][Index]
    
    
    

    Puo; pa a>][RODBCteat eng>ccesgn bslsx=c intethat "haetocmRibtRibcirs sviashtrnelsm>via,grelnee funs tocdar> aaesxitTforeseats cas>,eras cllnee funs opennninunce,gade> aowsta re ibodema>fungfdecwa, wis(stored incafCtsn/Icnurie(ohat beral byns to rsHyntlecce>cviac< in

    renurn tt.dshtrnel nog*/
        INTEGER(ans)[0] = nChtrnels;
     e e/*>ene st. rllnee funistrde  ascaniatrrificea*/
        setAtrrif(ams,cats  uo(;    &rllnee fun.strde r    &),wclnet/);
        setAtrrif(ams,cats  uo(;    &yyntle_ptr;    &),cptr);
        UNPROTECT(3)&
        renurn ans;
    ame)
    
    Noerettec#bug segiven tocidentffyld odn pacpar hecel=c intethat "h,gade d odn x abld ecfinaliz"hx SincecthecfinalaergInstrewaonndeg srerebu d i finaliz"h is 1e>ccof t o Rcn" afun (unl ex i rhashes).siT>rs rs n becal clou" aty,ilean up st. rllnee funidoctt.ddar> aaee ierowfTnaliz"h e drnis(simply>< in

    smaereallyon ed
    C */
    }
    ame)
    
    Clearde ndoutnhat "hfaty,ihr">de nfdecal1e>cc). ul>Rrough&rs cllnee funs proviedean fer oexor ex abllude usx=c intethat "ha, ro ttnincase pureey>al b.ele="Ttocs toa fTnaliz"h al clou" aty,destroy tre sllnee funiafcimTi sm logneVfrs n bx a>). up>

    Next: weak , Up:   [Contents][Index]

    
    
    

    div>T t vrcpg ess" ag s l kecs to ch to ccedR eourcece dr,eapurt liom s to ch to e>c). ul>The ess" ag sfthe funseThr">e hnind ey erecbsebu used o at oppropnd-to dypea ens to e>c). ul>Ifcefficietcyciss" aeCoial, st. mRANide atas04 orteseby doutC++ e ng cor.siNfVfrs n b odle atas04 ortesx a>). up>

    &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][Error yyntlde g-1":op"> <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][the nsecess" access3">Ryntomd>the nsg-1":op"> <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][Prdetde g-1":op"> <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][NInsrdcal an nysis subraut;-bsg-1":op"> <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][Optimiza irmg-1":op"> <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][gra irmecess" access9inde">gra irmg-1":op"> <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][Re-wncpide cces:op"> <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][fun">Pla fdemTantev(in ve ibodema>fung-1":op"> <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> utr> The (Tereca larne >the ncoenwntryethat s itrt>edR execur> itryouh Cle drnby>< in

     ao (Terec fer oass="h fibes
    tao (Td nincats caio

    isertoo,gram,manytabld ecfeaturisnd ey TpntRib shplld,bl regarted a yundocunstrberade>uns/p> <. a>). ul>Most end es.dass="h fibes,cat

    ide n uoodoossorn

    iserbynsata). ubRf_k e">sb>Noer:opb>wBscas toRcre-maps,manytablitscsx=c inte> is tt.doppropnd-todass="h fibescwaonnn fng tt.senwntryethat s. a>).opbRf_k e">div>T is remRppfng catscas toprcvlems

    ][139cces,gade>cat becelimin-tod bynoef;-ebu s to cvia hataide n fer oass="hs (ages as systomdass="hs ade>tros decwx=c intesoftwar.en becby doutnuo; pa)ebeode sata<>). ul>We catscpre ffyld odwntryethat s as>< in sviaAPIinvia/prtmpld>

    Entryethat s hy, werecdocunstrberitrt>rs mRnual and de ). u/d> pldt>sviapublicinvia/prtmpld>

    Entryethat s de smaedocunstrberade>sub ,omee> shtrgi odleout >otic . a>). u/d> pldt>sviapbivnteinvia/prtmpld>

    Entryethat s ttninar.dn be waonnbuildfngcRrade>exnortes une uooR ng tfdemseram,a"> smaede ).opd> pl/dl. up>


    &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][Usr -clntrolobenmemdeycces:op"> <"> & &op"> <"> olign="left" volign="alp">/op"> utr> The (Terectwotdypesw enmemdey alRf_-tdse nvaille="Ttoctt.dClnrcgramm"h, oge in hy, wR mRn pasTd oorlean-up>ene st. fer oin hy, wusr hl) funi clntroln(entereepors pinieo). a>). up>
    &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> utr>
    /a>, Previous: , Up:   [Contents][Index]
    
    
    

    ul>The (Terectwotatasrfacecfthe fun proviedecal c uoo< by cc< in

    M" abewsw enmo (Td nn 255Tchtractirs erectrthe-ted, wdle aowarnde . a>). up>
    the nseting">

    &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> utr>
    /a>, Previous: , Up:   [Contents][Index]
    
    
    

    div>On,manytsystoms FORTRANT< by ccompt be invis possunnsRANGUIingAN,p asrfacese iersyTas wsmaenortavaraand,best aeoidodx><>). ul>T ree subraut;-bsTas wproviedecal easc d inautputn fcidodema>fungfiom FORTRANTe dr.>< in

    &buni;o/a>][ <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][hema>fcalwothe funsecess" access2">Ma>hema>fcal fthe funscces:op"> <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][NInsrdcal Utinieiescces:op"> <"> & &op"> <"> olign="left" volign="alp">/op"> u/"> &buni;o/a>][hema>fcalwclnetaconteess" access4inMa>hema>fcal clnetaconcces:op"> <"> & &op"> <"> olign="left" volign="alp">/op"> utr>
    u/"> ber>od"> <"> widthss22%">h to ode>c <"> widthss30%">h to cc u/"> nsewc ntrat ber>od"> <"> widthss22%">h to ode>c <"> widthss30%">h to ccc u/"> binomialcc"> <"> widthss22%">h to c <"> widthss30%">h to cc u/"> Cauchycc"> <"> widthss22%">h to c <"> widthss30%">h to ce>c u/"> chi-ughhredcc"> <"> widthss22%">h to c <"> widthss30%">h to c u/"> nsewc ntrat chi-ughhredcc"> <"> widthss22%">h to c <"> widthss30%">h to cc u/"> xp beCoialcc"> <"> widthss22%">h to < xpcce>c <"> widthss30%">h to e>cnorteat eng> h to e>c u/"> Fcc"> <"> widthss22%">h to c <"> widthss30%">h to cc u/"> nsewc ntrat Fcc"> <"> widthss22%">h to c <"> widthss30%">h to ccc u/"> gale cc"> <"> widthss22%">h to c <"> widthss30%">h to ce>c u/"> geometriccc"> <"> widthss22%">h to c <"> widthss30%">h to c u/"> hyp"hpaometriccc"> <"> widthss22%">h to c <"> widthss30%">h to ccc u/"> logisrdccc"> <"> widthss22%">h to c <"> widthss30%">h to ce>c u/"> lognorinncc"> <"> widthss22%">h to e>c <"> widthss30%">h to cc u/"> negative binomialcc"> <"> widthss22%">h to c <"> widthss30%">h to cc u/"> nsrinncc"> <"> widthss22%">h to e>c <"> widthss30%">h to code>c u/"> Poissseg-"> <"> widthss22%">h to c <"> widthss30%">h to ode>c u/"> Stud ntrough&rs tg-"> <"> widthss22%">h to c <"> widthss30%">h to c u/"> nsewc ntrat tg-"> <"> widthss22%">h to c <"> widthss30%">h to code>c u/"> Stud ntizod rtrgig-"> <"> widthss22%">h to c <"> widthss30%">h to ce>cc u/"> uniforig-"> <"> widthss22%">h to c <"> widthss30%">h to cc u/"> Weibunig-"> <"> widthss22%">h to c <"> widthss30%">h to ce>c u/"> Wmpcoxseoranktablg-"> <"> widthss22%">h to c <"> widthss30%">h to cc u/"> Wmpcoxseosigned rankg-"> <"> widthss22%">h to c <"> widthss30%">h to c utr> div>Entries marked wdle ancasteriskconly yyvsomeugh&r ataccccc). ul>(Ifc[s04 r" aes,rt>e Norinnfdisrrificdse > cce>ce>c). ul>Fdec hecnegative binomialfdisrrificdse (&lugh&r atacc?NegBinomialcckbd>rid R. a>). uv>Fthe funses to e>ce>cin < by ce>ce>cc).

    <>).
    
    


    hema>fcalwothe funseting">

    Next:   [Contents][Index]

    
    hema>fcalwothe funscsing-1">
    

    s/p>

    ThecGale cfthe fun, t o > nural logandthmedf ios absolutedvalu gade firstafou odebivntivesnade> ao n-le debivntiveedf Psi, t o digale fthe fun, hy, wiss o oebivntiveedf h to ce>ce>cc< inopd> l/dl. u/p>

    s/p> csing-1"ef="#Fun: oifydoubleccviaciat eng>lber>odat eng> hvia(double a-an

    Thec(e ngleee) Ber>cfthe funnaty,i s d nural logandthm.>< inopd> l/dl. u/p>

    s/p> hvia(double a-ans/p> hvia(double a-an

    Thec>the ncoene nbin-t ves abla-an nural logandthmedf ios absolutedvalu , gen -lizod tocerbitraey real a-an< inopd> l/dl. u/p>

    s/p> cc xp(-ccapat>a-an &I(a-an &o-anfdecwxp(a-an &K(a-an &o-anfiar<-an< xp od-anh to <<-an< xp od-anc).opd> l/dl.


    u/ass=>u/"> Namig-"h> Def;-itfune(s to c raune(sify alu ccvia,n7)g-"h> /td>g-"hss=> u/"> s to e>c <"> hviawccviacc"> <"> 2.7182818cc"> u/"> s to e>c <"> log2(hviawccvia)cc"> <">>1.4426950cc"> u/"> s to e>c <"> log10(hviawccvia)cc"> <">>0.4342945cc"> u/"> s to e>c <"> ln(2)cc"> <">>0.6931472cc"> u/"> s to e>c <"> ln(10)cc"> <"> 2.3025851cc"> u/"> s to c <"> pim>"> <"> 3.1415927cc"> u/"> s to e>c <"> pi/2cc"> <">>1.5707963cc"> u/"> s to e>c <"> pi/4cc"> <">>0.7853982cc"> u/"> s to c <"> 1/pim>"> <"> 0.3183099cc"> u/"> s to c <"> 2/pim>"> <"> 0.6366198cc"> u/"> s to c <"> 2/sqrt(pi)cc"> <">>1.1283792cc"> u/"> s to e>c <"> sqrt(2)cc"> <">>1.4142136cc"> u/"> s to e>c <"> 1/sqrt(2)cc"> <">>0.7071068cc"> u/"> s to e>c <"> sqrt(3)cc"> <">>1.7320508cc"> u/"> s to e>c <"> sqrt(32)cc"> <">>5.6568542cc"> u/"> s to e>c <"> log10(2)cc"> <">>0.3010300cc"> u/"> s to c <"> 2*pim>"> <"> 6.2831853cc"> u/"> s to c <"> sqrt(pi)cc"> <">>1.7724539cc"> u/"> s to c <"> 1/sqrt(2*pi)cc"> <">>0.3989423cc"> u/"> s to c <"> sqrt(2/pi)cc"> <">>0.7978846cc"> u/"> s to c <"> ln(sqrt(pi))cc"> <">>0.5723649cc"> u/"> s to c <"> ln(sqrt(2*pi))cc"> <">>0.9189385cc"> u/"> s to e>c <"> ln(sqrt(pi/2))cc"> <">>0.2257914cc"> utr> div>The (Terec>cseo encnnetacon (s to ce>culcon) oef;-bde(unle ans to e>cise ass="h sata<>).

    Ftrfer ,ctt.din

    ise ass="h satae>ce>ce>cCoeyx Thisecatsclnflict omle fer osoftware: decwxor ex itsclnflicts wdle d e ass="hsnp cIJGrough&rs s to e>c<>). up>


    Next: gra irmecess" accesskey="n" rel="nIe">gra irmccese, Previous: , Up:   [Contents][Index]

    
    
    

    ul>ThecCde drnund"hlyfng < by e>c< in

    <>).
    The,p asrfaces (oef;-bdeincass="h sata< in