When working on any product, don't be surprised to be surprised. The biggest issues in building a successful product arise when teams aren't willing to look to their users and run tests to find out what is and isn't working. Designing something to work a certain way doesn't mean users will use it as originally intended.
A/B testing involves comparing two versions of a design (version A and version B). It most often occurs with homepages, landing pages, and newsletters. Once the designer creates two options for a layout, the versions are split between two different audiences made up of actual users. To be effective, an A/B test needs a large enough sample size. Once the pages have been live for a designated test period, you'll examine the analytics to determine which one performed better.
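To get a feel for what "large enough" means, here is a minimal Python sketch of the standard two-proportion sample size calculation. The baseline conversion rate (4%) and the lift you hope to detect (5%) are hypothetical numbers for illustration, not figures from this text.

```python
import math

def sample_size_per_variant(p_a, p_b):
    """Rough sample size per variant needed to detect the difference
    between two conversion rates (95% confidence, 80% power)."""
    z_alpha = 1.96  # two-sided z-score for a 5% significance level
    z_beta = 0.84   # z-score for 80% statistical power
    variance = p_a * (1 - p_a) + p_b * (1 - p_b)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p_a - p_b) ** 2)

# Hypothetical numbers: a 4% baseline conversion rate, hoping to detect a lift to 5%
print(sample_size_per_variant(0.04, 0.05))  # 6735 — roughly 6,700 visitors per version
```

The smaller the difference you want to detect, the more visitors each version needs before the results mean anything.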
Non-profit digital agency Whole Whale explains A/B testing with real-world examples. [9:31 minutes]
To implement most A/B tests you'll need the help of developers to write the code that runs the test (a minimal sketch of what that assignment code might look like follows the list below). Tools that can help you run the tests include:
Optimizely (their free e-book of case studies is worth downloading)
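If you're curious what that developer-side code involves, here is a minimal sketch, assuming the site can identify each visitor by some stable ID such as a cookie value. The function name and the "homepage-hero" experiment label are made up for illustration.

```python
import hashlib

def assign_version(visitor_id: str, experiment: str = "homepage-hero") -> str:
    """Deterministically assign a visitor to version A or version B.

    Hashing the visitor ID (rather than flipping a coin on every page load)
    keeps each person in the same group for the life of the test.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Hypothetical usage: the ID would normally come from a cookie or an account ID
print(assign_version("visitor-12345"))  # "A" or "B", stable for this visitor
```

Tools like Optimizely handle this assignment (and the analytics) for you; the sketch just shows what's happening under the hood.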
Newsletters are a good place to start because creating them doesn't require any code, and email platforms such as MailChimp make them conducive to A/B testing. When A/B testing, you only want to test or change ONE factor at a time (see the sketch after the list below for one way to split a mailing list). For newsletters, here are some different aspects you could test:
subject line
name in the "from" field
preview text
customizing the message with the recipient's name
content or message of the mailing
time or day sent
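Most email platforms handle the split for you, but if you ever need to do it by hand, here is a minimal sketch of randomly dividing a subscriber list in half so each group receives one version of the subject line. The addresses and subject lines are hypothetical.

```python
import random

subscribers = [f"reader{i}@example.com" for i in range(1000)]  # hypothetical list

random.shuffle(subscribers)           # randomize the order so the split is unbiased
midpoint = len(subscribers) // 2
group_a = subscribers[:midpoint]      # receives subject line A
group_b = subscribers[midpoint:]      # receives subject line B

# Only the subject line differs; everything else in the mailing stays the same
subject_a = "Our spring campaign is here"
subject_b = "You helped make this spring campaign possible"
print(len(group_a), len(group_b))     # 500 500
```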
As with all research, you want to be clear on what you're testing. You'll also want to consider external factors, such as whether it's a holiday weekend, since that may affect open rates. Whole Whale wrote a post on A/B testing with MailChimp that covers additional considerations.
A/B tests tend to result in clear winners, where one design performs measurably better than the other. This is where the numbers and quantitative data come in handy.
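To judge whether one version really performed better, rather than just better by chance, a simple two-proportion z-test is a common check. Here is a minimal sketch using hypothetical visitor and conversion counts, not data from this text.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical results: version A converted 480 of 10,000 visitors, version B 550 of 10,000
z = two_proportion_z(480, 10_000, 550, 10_000)
print(round(z, 2))  # about -2.24; |z| > 1.96 suggests a real difference at the 95% level
```

If the z-score falls inside ±1.96, the "winner" may just be noise and the test should run longer or with more users.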