{"id":40758,"date":"2016-05-19T16:38:36","date_gmt":"2016-05-19T23:38:36","guid":{"rendered":"http:\/\/www.bruceclay.com\/blog\/?p=40758"},"modified":"2023-03-30T11:35:28","modified_gmt":"2023-03-30T18:35:28","slug":"conduct-solid-data-driven-conversion-research-convcon","status":"publish","type":"post","link":"https:\/\/www.bruceclay.com\/blog\/conduct-solid-data-driven-conversion-research-convcon\/","title":{"rendered":"How to Conduct Solid, Data-Driven Conversion Research #ConvCon"},"content":{"rendered":"
“If I had an hour to save the world, I’d spend 55 minutes identifying the problem and 5 minutes implementing the solution.” — Albert Einstein

You’re tuned in to Conversion Conference 2016 and a presentation by Michael Aagaard of Unbounce. He loves that quote by Einstein because it relates to CRO. The story he’s going to tell today is about changing our mindset from just straight testing to a broader understanding of the problem.

He starts us off viewing a landing page with a lead capture form. Being a conversion optimizer, he wanted to optimize the page. He removed three of the fields on what he’d call a monster form. The result was 14% lower conversions. Ouch! So next he went looking at where the drop-off occurs on the form. He found which form fields had low interaction and high drop-off and addressed them by rearranging the order of the fields (putting the low-commitment ones higher up) and tweaking the label copy. This time they got a 19% increase in conversions.
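He doesn’t share the raw numbers behind that field-level analysis, but the logic is easy to sketch. Here is a minimal example of flagging problem fields, assuming you already collect per-field interaction and abandonment counts from a form analytics tool (all names and thresholds here are hypothetical):

```typescript
// Hypothetical per-field stats from a form analytics tool.
interface FieldStats {
  name: string;
  interactions: number; // visitors who focused the field
  abandons: number;     // visitors who left the form at this field
}

// Flag fields with low interaction or high drop-off -- the two signals
// used above to decide which fields to move up or relabel.
function flagProblemFields(
  fields: FieldStats[],
  formViews: number,
  minInteractionRate = 0.5,
  maxDropOffRate = 0.2
): FieldStats[] {
  return fields.filter((f) => {
    const interactionRate = f.interactions / formViews;
    const dropOffRate = f.abandons / Math.max(f.interactions, 1);
    return interactionRate < minInteractionRate || dropOffRate > maxDropOffRate;
  });
}

// Example: the phone field gets little interaction and loses many visitors.
const suspects = flagProblemFields(
  [
    { name: "email", interactions: 900, abandons: 40 },
    { name: "phone", interactions: 400, abandons: 180 },
  ],
  1000
);
console.log(suspects.map((f) => f.name)); // ["phone"]
```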
The question: why didn’t he do the research right away, and why did he jump to best practices?

It’s very difficult to solve a problem that you don’t understand. Vice versa, it’s easy to solve a problem when you understand it.

He asked other conversion optimizers what keeps them from doing conversion research. Whatever the answers, split testing is not an excuse to skip your homework.
6 Things You Can Do Right Away

…to conduct better research, form better hypotheses, and get better results.
1. Manual step-drop analysis with Google Analytics

Break your funnel into steps in Google Analytics and look at where visitors drop off between them. He walks through an example; the same approach works for ecommerce. There’s a custom report in GA that he wrote, and we might be able to get it later.
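We don’t have the custom report itself, but the arithmetic behind a manual step-drop analysis is straightforward. A sketch with made-up funnel numbers, assuming you’ve exported a visitor count for each step:

```typescript
// Funnel steps with visitor counts (illustrative numbers, e.g. pulled
// from a Google Analytics custom report).
const funnel: Array<[string, number]> = [
  ["Landing page", 10000],
  ["Pricing page", 4200],
  ["Signup form", 1500],
  ["Confirmation", 900],
];

// Print the step-to-step drop-off so the leakiest transition stands out.
for (let i = 1; i < funnel.length; i++) {
  const [prevName, prevCount] = funnel[i - 1];
  const [name, count] = funnel[i];
  const dropOff = ((prevCount - count) / prevCount) * 100;
  console.log(`${prevName} -> ${name}: ${dropOff.toFixed(1)}% drop-off`);
}
```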
2. Run feedback polls on critical pages

There’s a conflict in CROs here, but for everyday ninja analysis, feedback polls are cool, unobtrusive, and you just ask one question. You can do them wrong, though. A question like “Did you find what you were looking for today?” followed by a scale of 1 to 10 is bad: what does it mean if 50% of people choose 4? That data is useless. Instead, start with the question “What were you looking for?” and then ask “Did you find it?”

His tip is to lower the perceived time investment of filling out the poll with clever formatting: the person clicks “yes” or “no,” and only then does the form change to let them type in the reason why.
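A minimal sketch of that two-step pattern follows; the element IDs and question copy are illustrative assumptions, not Unbounce’s actual implementation:

```typescript
// Two-step poll: a one-click answer first, then reveal the open-ended
// follow-up only after the visitor has committed. IDs are hypothetical.
const question = document.getElementById("poll-question")!;
const why = document.getElementById("poll-why") as HTMLTextAreaElement;

function onAnswer(foundIt: boolean): void {
  // The low-commitment click is recorded immediately...
  console.log(`Did they find it: ${foundIt}`);
  // ...then the form changes to ask for the reason why.
  question.textContent = foundIt
    ? "Great! What convinced you?"
    : "Sorry to hear it. What were you looking for?";
  why.hidden = false;
}

document.getElementById("poll-yes")!.onclick = () => onAnswer(true);
document.getElementById("poll-no")!.onclick = () => onAnswer(false);
```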
3. Conduct interviews with sales and support

He shares a list of questions to ask them.
4. Perform 5-second tests

Here’s the tool: http://fivesecondtest.com/. You give a user a screenshot to view for five seconds and then ask, “What do you think this page was about?” He showed users an Unbounce page with an employee of theirs on it. Yes, we think people on pages are good for conversions. But when they showed that page to five-second testers, no one knew what the page was about, and some even said they were distracted by the image.
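The tool automates this, but the mechanic is simple enough to mock up for your own pages. A rough sketch, with hypothetical element IDs:

```typescript
// Show the screenshot for exactly five seconds, then hide it and ask
// for the visitor's first impression. Element IDs are made up.
const screenshot = document.getElementById("screenshot") as HTMLImageElement;
const promptEl = document.getElementById("prompt")!;

screenshot.hidden = false;
setTimeout(() => {
  screenshot.hidden = true;
  promptEl.textContent = "What do you think this page was about?";
  promptEl.hidden = false;
}, 5000);
```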
5. Calculate your sample size and test duration

Before you can call a test trustworthy, you need statistical significance. There’s a fascinating set of calculations behind this, so look for a sample size and test duration calculator. Unbounce.com has one: the A/B Test Duration & Sample Size Calculator.
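If you want to see what such a calculator does under the hood, the usual approach is the two-proportion z-test approximation. A sketch with the common defaults of 95% confidence and 80% power (a generic formulation, not necessarily Unbounce’s exact math):

```typescript
// Visitors needed per variant to detect a given lift, using the standard
// two-proportion z-test approximation.
// 1.96 = z for 95% confidence (two-sided); 0.8416 = z for 80% power.
function sampleSizePerVariant(
  baselineRate: number, // e.g. 0.05 for a 5% conversion rate
  relativeLift: number, // minimum lift to detect, e.g. 0.15 for +15%
  zAlpha = 1.96,
  zBeta = 0.8416
): number {
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const pBar = (p1 + p2) / 2;
  const numerator =
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
    zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2));
  return Math.ceil((numerator / (p2 - p1)) ** 2);
}

// Duration follows from traffic: ~14,200 per variant at a 5% baseline
// detecting a +15% lift, so about 15 days at 2,000 visitors/day split 50/50.
const perVariant = sampleSizePerVariant(0.05, 0.15);
const days = Math.ceil((perVariant * 2) / 2000);
console.log(`${perVariant} visitors per variant, run ~${days} days`);
```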
6. Formulate a data-driven test hypothesis

You need to know some things before you can make a hypothesis. Here’s a mad-libs style hypothesis exercise you can fill out:

Because ________, we expect that ________ will cause ________. We’ll measure this using ________. We expect to see reliable results in ________.
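To make the exercise concrete, here’s a hypothetical fill-in that echoes the form story from the top of this post (the blank values are my own illustration):

```typescript
// The mad-libs hypothesis as a typed template. The filled-in values
// below are illustrative, echoing the form example from the talk.
interface Hypothesis {
  because: string;  // the research finding
  change: string;   // what you'll change
  effect: string;   // the outcome you expect
  metric: string;   // how you'll measure it
  duration: string; // when results become reliable
}

const toStatement = (h: Hypothesis): string =>
  `Because ${h.because}, we expect that ${h.change} will cause ${h.effect}. ` +
  `We'll measure this using ${h.metric}. ` +
  `We expect to see reliable results in ${h.duration}.`;

console.log(
  toStatement({
    because: "field-level analytics show low interaction and high drop-off on the form",
    change: "moving low-commitment fields to the top",
    effect: "more completed submissions",
    metric: "form conversion rate in Google Analytics",
    duration: "15 days, per the sample size calculation above",
  })
);
```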
Final Thought

It’s all about seeing through the eyes of your users. Data-driven empathy is what it’s about. He gives credit to Andy Crestodina, sitting behind me, for that phrase.

Those are the reasons that you have to do conversion research. Be like Einstein: prioritize understanding the problem before you start your testing. Always be aware of bias and be critical of data (you can make it say whatever you want if you torture it enough). Also, split testing is only a tool.
Subscribe to the blog to get all the news coming out of Conversion Conference 2016!