Tables in R (And How to Export Them to Word) (2024)

How to export tables from R depends on what word processor you use. This tutorial focuses on Word. If you use LaTeX, there are many existing R packages and tutorials that will get you started, including xtable and stargazer.

To export tables to Word, follow these general steps:

  1. Create a table or data.frame in R.
  2. Write this table to a comma-separated .txt file using write.table().
  3. Copy and paste the content of the .txt file into Word.
  4. In Word,
    1. select the text you just pasted from the .txt file
    2. go to Table → Convert → Convert Text to Table…
    3. make sure “Commas” is selected under “Separate text at”, click OK
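The first two steps can be sketched in a few lines of R. This is a minimal example with a made-up data frame; the file name example.txt is just a placeholder:

```r
# Step 1: create a small data frame (hypothetical example data)
df <- data.frame(group = c("A", "B"), mean = c(1.2, 3.4))

# Step 2: write it to a comma-separated .txt file in the working directory
write.table(df, file = "example.txt", sep = ",", quote = FALSE, row.names = FALSE)

# example.txt now contains:
# group,mean
# A,1.2
# B,3.4
```

Setting quote = FALSE and row.names = FALSE keeps the file free of quotation marks and a row-name column, so the pasted text converts cleanly in Word.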

You’ll now have a basic table that you can format in Word. Below are three examples of how to use this process to create crosstabs, tables for summary statistics, and regression tables.

Data and Packages

Before we get started, read a dataset on U.S. states (codebook here) into R:

states <- read.csv("states.csv")

Also install and load packages dplyr, tidyr, and broom:

pkgs <- c("dplyr", "tidyr", "broom")
install.packages(pkgs)                        # install
sapply(pkgs, require, character.only = TRUE)  # load
## dplyr tidyr broom
##  TRUE  TRUE  TRUE

Create a table showing the proportion of states that supported Bush in 2000, by region (South versus Non-South):

# Create table
t <- with(states, table(south, gb_win00))
t <- prop.table(t, margin = 1)
t  # large majority of southern states supported Bush in 2000:
##           gb_win00
## south      Bush win  Gore win
##   Nonsouth 0.4705882 0.5294118
##   South    0.8750000 0.1250000

# Write this table to a comma-separated .txt file:
write.table(t, file = "bush_south.txt", sep = ",", quote = FALSE, row.names = FALSE)

The .txt file will end up in your working directory. Now follow steps 3 and 4 in the Overview section above to create the crosstab in Word.

Here’s another example that again uses the states.csv dataset. Say we wanted to create a table with summary statistics for five of the variables in this dataset:

sumstat <- states %>%
  # Select and rename five variables
  select(
    `Black (%)` = blkpct,
    `Attend church (%)` = attend_pct,
    `Supported Bush in 2000 (%)` = bush00,
    `Supported Obama in 2008 (%)` = obama08,
    `Women in State Legislature (%)` = womleg
  ) %>%
  # Find the mean, st. dev., min, and max for each variable
  summarise_each(funs(mean, sd, min, max)) %>%
  # Move summary stats to columns
  gather(key, value, everything()) %>%
  separate(key, into = c("variable", "stat"), sep = "_") %>%
  spread(stat, value) %>%
  # Set order of summary statistics
  select(variable, mean, sd, min, max) %>%
  # Round all numeric variables to one decimal point
  mutate_each(funs(round(., 1)), -variable)
sumstat
##                         variable mean  sd  min  max
## 1              Attend church (%) 38.9 9.4 22.0 60.0
## 2                      Black (%) 10.3 9.7  0.4 36.8
## 3     Supported Bush in 2000 (%) 50.4 8.7 31.9 67.8
## 4    Supported Obama in 2008 (%) 50.5 9.5 32.5 71.8
## 5 Women in State Legislature (%) 23.2 7.3  8.8 37.8

# Write to .txt
write.table(sumstat, file = "sumstats.txt", sep = ",", quote = FALSE, row.names = FALSE)

Again, the sumstats.txt file will end up in your working directory, and you can use steps 3 and 4 from the Overview section above to import this file into Word.

Exercise

Create a table of summary statistics in Word for vep04_turnout, vep08_turnout, unemploy, urban, and hispanic. The table should include the number of observations (n), mean, median, 10th percentile, and 90th percentile of each of the variables. Put the variables in the rows of the table and the summary statistics in the columns, like we did in the example above. Format your table in Word to make it look similar to this table.

Say we wanted to run three OLS models to predict state-level support for Bush in 2000, where each model adds a predictor to the preceding model. We can create a regression table with all three models like so:

m1 <- tidy(lm(bush00 ~ blkpct, data = states))
m2 <- tidy(lm(bush00 ~ blkpct + south, data = states))
m3 <- tidy(lm(bush00 ~ blkpct + south + womleg, data = states))
# Note that tidy() from the broom package is used to convert each model to a data frame
all_models <- bind_rows(
  m1 %>% mutate(model = 1),
  m2 %>% mutate(model = 2),
  m3 %>% mutate(model = 3))
all_models
## Source: local data frame [9 x 6]
##
##          term    estimate std.error statistic      p.value model
##         (chr)       (dbl)     (dbl)     (dbl)        (dbl) (dbl)
## 1 (Intercept) 50.92242670 1.8269042 27.873617 2.674311e-31     1
## 2      blkpct -0.04645116 0.1295857 -0.358459 7.215717e-01     1
## 3 (Intercept) 51.26374042 1.7392966 29.473834 5.855540e-32     2
## 4      blkpct -0.35982687 0.1753724 -2.051788 4.578488e-02     2
## 5  southSouth  9.04563705 3.6085132  2.506749 1.570125e-02     2
## 6 (Intercept) 66.01152726 4.5218532 14.598335 6.949435e-19     3
## 7      blkpct -0.41579167 0.1585744 -2.622061 1.181097e-02     3
## 8  southSouth  6.03572838 3.3595138  1.796608 7.896601e-02     3
## 9      womleg -0.56807999 0.1634218 -3.476157 1.121456e-03     3

# Now make this data frame look more like a regression table
ols_table <- all_models %>%
  select(-statistic, -p.value) %>%
  mutate_each(funs(round(., 2)), -term) %>%
  gather(key, value, estimate:std.error) %>%
  spread(model, value)
ols_table
## Source: local data frame [8 x 5]
##
##          term       key     1     2     3
##         (chr)     (chr) (dbl) (dbl) (dbl)
## 1 (Intercept)  estimate 50.92 51.26 66.01
## 2 (Intercept) std.error  1.83  1.74  4.52
## 3      blkpct  estimate -0.05 -0.36 -0.42
## 4      blkpct std.error  0.13  0.18  0.16
## 5  southSouth  estimate    NA  9.05  6.04
## 6  southSouth std.error    NA  3.61  3.36
## 7      womleg  estimate    NA    NA -0.57
## 8      womleg std.error    NA    NA  0.16

# Export
write.table(ols_table, file = "olstab.txt", sep = ",", quote = FALSE, row.names = FALSE)

Again, follow steps 3 and 4 from the Overview section above to import the content of the .txt file into Word.
