Anton Antonov
MathematicaVsR at GitHub
November, 2016

Introduction

This R-Markdown notebook was made for the R part of the MathematicaVsR project “Text analysis of Trump tweets”.

The project is based on the blog post [1]; this R-notebook uses the data from [1] and provides statistical extensions or alternatives. For conclusions about those statistics see [1].

Load libraries

Here are the libraries used in this R-notebook. In addition to those in [1] the libraries “vcd” and “arules” are used.

library(plyr)
library(dplyr)
library(tidyr)
library(ggplot2)
library(lubridate)
library(arules)
library(vcd)

Getting data

We are not going to repeat the Twitter message ingestion done in [1] – instead, we use the data frame ingestion result provided in [1].

load(url("http://varianceexplained.org/files/trump_tweets_df.rda"))
#load("./trump_tweets_df.rda")

Data wrangling – extracting source devices and adding time tags

As in the blog post [1], we project and clean the data:

tweets <- trump_tweets_df %>%
  select(id, statusSource, text, created) %>%
  extract(statusSource, "source", "Twitter for (.*?)<") %>%
  filter(source %in% c("Android", "iPhone"))
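
The column statusSource holds an HTML anchor whose text ends with the device name, so the regular expression above captures the text between “Twitter for ” and the closing tag. A small illustration with a made-up value:

# Illustrative statusSource value (not taken from the actual data)
exampleSource <- '<a href="http://twitter.com/download/android" rel="nofollow">Twitter for Android</a>'
sub( ".*Twitter for (.*?)<.*", "\\1", exampleSource, perl = TRUE )  # returns "Android"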

Next we add time tags derived from the time-stamp column “created”. For the analysis that follows, only the dates, hours, and weekdays are needed.

tweets <- cbind( tweets,
                 date    = as.Date(tweets$created),
                 hour    = hour(with_tz(tweets$created, "EST")),
                 weekday = weekdays(as.Date(tweets$created)) )
summary(as.data.frame(unclass(tweets)))
                  id           source   
 676494179216805888:   1   Android:762  
 676509769562251264:   1   iPhone :628  
 678442470720577537:   1                
 678446032599040001:   1                
 678490367285678081:   1                
 680492103722479616:   1                
 (Other)           :1384                
                                                                                                                                         text     
 MAKE AMERICA GREAT AGAIN!                                                                                                                 :   5  
 . #RepMikeKelly  Great job on @foxandfriends this morning. Thank you for the nice words!                                                  :   1  
 .@AC360  Anderson, so amazing. Your mother is, and always has been, an incredible woman!                                                  :   1  
 .@AndreBauer  Great job and advice on @CNN  @jaketapper  Thank you!                                                                       :   1  
 .@AnnCoulter has been amazing. We will win and establish strong borders, we will build a WALL and Mexico will pay. We will be great again!:   1  
 .@Borisep was great on @JudgeJeanine tonight. Very smart commentary that will prove to be correct!                                        :   1  
 (Other)                                                                                                                                   :1380  
    created                         date                 hour            weekday   
 Min.   :2015-12-14 20:09:15   Min.   :2015-12-14   Min.   : 0.00   Friday   :200  
 1st Qu.:2016-03-20 19:59:25   1st Qu.:2016-03-20   1st Qu.: 8.00   Monday   :164  
 Median :2016-05-18 01:00:49   Median :2016-05-18   Median :14.00   Saturday :187  
 Mean   :2016-05-11 17:50:36   Mean   :2016-05-11   Mean   :13.44   Sunday   :171  
 3rd Qu.:2016-07-04 13:52:59   3rd Qu.:2016-07-04   3rd Qu.:18.00   Thursday :181  
 Max.   :2016-08-08 15:20:44   Max.   :2016-08-08   Max.   :23.00   Tuesday  :217  
                                                                    Wednesday:270  
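
The derived time tags can be related to the device source directly; for example, here is a mosaic plot of tweet creation hours with respect to the device source (using base R's mosaicplot; the package “vcd” offers mosaic() as an alternative):

# Mosaic plot of tweet creation hours vs. device source
mosaicplot( hour ~ source, tweets, dir = "h", color = TRUE )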

Comparison by used words

This section demonstrates a way to derive word-device associations that is an alternative to the approach in [1]. The association rule learning algorithm Apriori is used through the package “arules”.

First we split the tweet messages into bags of words (baskets).

sres <- strsplit( iconv(tweets$text),"\\s")
sres <- llply( sres, function(x) { x <- unique(x); x[nchar(x)>2] })
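
Each element of sres is now a bag of distinct words per tweet, with tokens of two characters or fewer dropped. For example, the bag derived from the first tweet:

# Peek at the bag of words of the first tweet
sres[[1]]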

The package “arules” does not work directly with lists of lists (in this case, a list of bags of words, i.e. baskets). We have to derive a binary incidence matrix from the bags of words.

Here we add the device tags to those bags of words and derive a long form of tweet-index and word pairs:

sresDF <- 
  ldply( 1:length(sres), function(i) {
    data.frame( index = i, word = c( tweets$source[i], sres[i][[1]]) )
  })
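
The long form has one row per tweet-index and word pair; the device tag is the first “word” of each tweet:

# First few index-word pairs
head( sresDF, 8 )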

Next we find the contingency matrix for index vs. word:

wordsCT <- xtabs( ~ index + word, sresDF, sparse = TRUE)
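
The result is a sparse tweets-by-words incidence matrix; its dimensions correspond to the transaction and item counts reported by apriori below:

# Rows are tweets (transactions), columns are words (items)
dim( wordsCT )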

At this point we can use the Apriori algorithm of the package:

rulesRes <- apriori( as.matrix(wordsCT), parameter = list(supp = 0.01, conf = 0.6, maxlen = 2, target = "rules"))
Apriori

Parameter specification:
 confidence minval smax arem  aval originalSupport maxtime support minlen maxlen target   ext
        0.6    0.1    1 none FALSE            TRUE       5    0.01      1      2  rules FALSE

Algorithmic control:
 filter tree heap memopt load sort verbose
    0.1 TRUE TRUE  FALSE TRUE    2    TRUE

Absolute minimum support count: 13 

set item appearances ...[0 item(s)] done [0.00s].
set transactions ...[6572 item(s), 1390 transaction(s)] done [0.00s].
sorting and recoding items ... [184 item(s)] done [0.00s].
creating transaction tree ... done [0.00s].
checking subsets of size 1 2
Mining stopped (maxlen reached). Only patterns up to a length of 2 returned!
 done [0.00s].
writing ... [171 rule(s)] done [0.00s].
creating S4 object  ... done [0.00s].
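
Before subsetting by device we can get an overview of the mined rules; a minimal sketch:

# Distributions of support, confidence, and lift over the 171 mined rules
summary( rulesRes )
# The top five rules by lift
inspect( sort( rulesRes, by = "lift" )[1:5] )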

Here are association rules for “Android” sorted by confidence in descending order:

inspect( subset( sort(rulesRes, by="confidence"), subset = rhs %in% "Android" & confidence > 0.78) )
     lhs                   rhs       support    confidence lift    
[1]  {A.M.}             => {Android} 0.01007194 1.0000000  1.824147
[2]  {@megynkelly}      => {Android} 0.01726619 1.0000000  1.824147
[3]  {@realDonaldTrump} => {Android} 0.08057554 0.9911504  1.808004
[4]  {Wow,}             => {Android} 0.01510791 0.9545455  1.741231
[5]  {time}             => {Android} 0.01366906 0.9500000  1.732940
[6]  {done}             => {Android} 0.01223022 0.9444444  1.722805
[7]  {over}             => {Android} 0.01079137 0.9375000  1.710138
[8]  {president}        => {Android} 0.01007194 0.9333333  1.702537
[9]  {because}          => {Android} 0.01870504 0.9285714  1.693851
[10] {@CNN}             => {Android} 0.01726619 0.9230769  1.683828
[11] {were}             => {Android} 0.01510791 0.9130435  1.665526
[12] {beat}             => {Android} 0.01366906 0.9047619  1.650419
[13] {U.S.}             => {Android} 0.01294964 0.9000000  1.641732
[14] {win}              => {Android} 0.01870504 0.8965517  1.635442
[15] {big}              => {Android} 0.01798561 0.8928571  1.628703
[16] {against}          => {Android} 0.01798561 0.8928571  1.628703
[17] {said}             => {Android} 0.02230216 0.8857143  1.615673
[18] {made}             => {Android} 0.01079137 0.8823529  1.609541
[19] {won}              => {Android} 0.01007194 0.8750000  1.596129
[20] {being}            => {Android} 0.01007194 0.8750000  1.596129
[21] {country}          => {Android} 0.01510791 0.8750000  1.596129
[22] {had}              => {Android} 0.01942446 0.8709677  1.588773
[23] {job}              => {Android} 0.01438849 0.8695652  1.586215
[24] {Republican}       => {Android} 0.02302158 0.8648649  1.577641
[25] {than}             => {Android} 0.02230216 0.8611111  1.570793
[26] {@nytimes}         => {Android} 0.01294964 0.8571429  1.563555
[27] {media}            => {Android} 0.02158273 0.8571429  1.563555
[28] {vote}             => {Android} 0.01654676 0.8518519  1.553903
[29] {You}              => {Android} 0.01223022 0.8500000  1.550525
[30] {more}             => {Android} 0.02446043 0.8500000  1.550525
[31] {jobs}             => {Android} 0.01079137 0.8333333  1.520122
[32] {but}              => {Android} 0.03165468 0.8301887  1.514386
[33] {would}            => {Android} 0.02733813 0.8260870  1.506904
[34] {very}             => {Android} 0.03381295 0.8245614  1.504121
[35] {America}          => {Android} 0.01007194 0.8235294  1.502239
[36] {got}              => {Android} 0.01007194 0.8235294  1.502239
[37] {ever}             => {Android} 0.01294964 0.8181818  1.492484
[38] {total}            => {Android} 0.01294964 0.8181818  1.492484
[39] {Sanders}          => {Android} 0.01582734 0.8148148  1.486342
[40] {totally}          => {Android} 0.01870504 0.8125000  1.482119
[41] {@FoxNews}         => {Android} 0.01798561 0.8064516  1.471086
[42] {Bernie}           => {Android} 0.02374101 0.8048780  1.468216
[43] {Trump}            => {Android} 0.04388489 0.8026316  1.464118
[44] {are}              => {Android} 0.06402878 0.8018018  1.462604
[45] {that}             => {Android} 0.08561151 0.7986577  1.456869
[46] {Ted}              => {Android} 0.02517986 0.7954545  1.451026
[47] {what}             => {Android} 0.01654676 0.7931034  1.446737
[48] {wants}            => {Android} 0.01079137 0.7894737  1.440116
[49] {just}             => {Android} 0.03237410 0.7894737  1.440116
[50] {much}             => {Android} 0.01582734 0.7857143  1.433258

And here are association rules for “iPhone” sorted by confidence in descending order:

iphRules <- inspect( subset( sort(rulesRes, by="confidence"), subset = rhs %in% "iPhone" & support > 0.01) )
     lhs                         rhs      support    confidence lift    
[1]  {#TrumpPence16}          => {iPhone} 0.01007194 1.0000000  2.213376
[2]  {THANK}                  => {iPhone} 0.01223022 1.0000000  2.213376
[3]  {#ImWithYou}             => {iPhone} 0.01366906 1.0000000  2.213376
[4]  {#VoteTrump}             => {iPhone} 0.01582734 1.0000000  2.213376
[5]  {#AmericaFirst}          => {iPhone} 0.01942446 1.0000000  2.213376
[6]  {Join}                   => {iPhone} 0.02733813 1.0000000  2.213376
[7]  {#Trump2016}             => {iPhone} 0.12302158 0.9500000  2.102707
[8]  {#CrookedHillary}        => {iPhone} 0.01151079 0.9411765  2.083177
[9]  {soon!}                  => {iPhone} 0.01151079 0.9411765  2.083177
[10] {#MakeAmericaGreatAgain} => {iPhone} 0.06546763 0.9100000  2.014172
[11] {#MAGA}                  => {iPhone} 0.01151079 0.8888889  1.967445
[12] {Thank}                  => {iPhone} 0.12086331 0.7850467  1.737603
[13] {you}                    => {iPhone} 0.11151079 0.7142857  1.580983
[14] {tonight}                => {iPhone} 0.01366906 0.6785714  1.501934
[15] {AGAIN!}                 => {iPhone} 0.01798561 0.6410256  1.418831
[16] {New}                    => {iPhone} 0.02086331 0.6304348  1.395389
[17] {you!}                   => {iPhone} 0.02446043 0.6296296  1.393607
[18] {&amp;}                  => {iPhone} 0.03669065 0.6219512  1.376612

Generally speaking, the package “arules” is somewhat awkward to use. For example, extracting the words from the column “lhs” requires some wrangling:

ws <- as.character(unclass(as.character(iphRules$lhs)))
gsub(pattern = "\\{|\\}", "", ws)
 [1] "#TrumpPence16"          "THANK"                  "#ImWithYou"             "#VoteTrump"             "#AmericaFirst"          "Join"                  
 [7] "#Trump2016"             "#CrookedHillary"        "soon!"                  "#MakeAmericaGreatAgain" "#MAGA"                  "Thank"                 
[13] "you"                    "tonight"                "AGAIN!"                 "New"                    "you!"                   "&amp;"                 

References

[1] David Robinson, “Text analysis of Trump’s tweets confirms he writes only the (angrier) Android half”, (2016), VarianceExplained.org. URL: http://varianceexplained.org/r/trump-tweets/ .
