Server-side A/B testing is a method of conducting experiments to compare two or more versions of a web page, feature, or user experience by implementing the variations on the server rather than in the user's browser.
This approach allows for more complex and robust testing scenarios compared to client-side A/B testing.
In server-side A/B testing:
- The server determines which version of the content or feature to show to each user.
- The entire page or feature is generated on the server before being sent to the user's browser.
- The user receives a complete version of the page or feature, unaware that they are part of an experiment.
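Because the server makes the decision, variant assignment is usually deterministic, so a returning user always sees the same version without any state being stored. The following Python sketch is illustrative only (not any vendor's actual implementation) and shows one common approach: hash the user ID together with the experiment key and map the result to a variant.

```python
import hashlib

def assign_variant(distinct_id: str, experiment_key: str,
                   variants=("control", "variant")) -> str:
    """Deterministically bucket a user into a variant.

    Hashing the user ID together with the experiment key means the same
    user always lands in the same bucket, on every request and on every
    server, without storing any assignment state.
    """
    digest = hashlib.sha256(f"{experiment_key}:{distinct_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The server then renders the complete page for the assigned variant,
# so the browser never sees the alternative version.
variant = assign_variant("user-123", "new-design-test")
```

Including the experiment key in the hash input matters: it keeps assignments independent across experiments, so the same user can be in the control group of one test and the variant group of another.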
When to use server-side A/B testing:
Testing significant changes: When you need to test major changes to your website or application that affect multiple elements or require complex logic.
Performance-critical applications: If you want to avoid any potential performance impact on the client side, since all processing is done on the server.

Testing backend algorithms: When you need to test different algorithms, database queries, or server-side processes that can't be implemented client-side.
Personalization: For implementing and testing personalized experiences based on user data stored on the server.
Security-sensitive features: When testing features that involve sensitive data or operations that shouldn't be exposed to the client.
Cross-device consistency: To ensure a consistent experience across different devices and platforms, as the variations are controlled server-side.
Testing early in the development cycle: When you want to test new features before they're fully implemented in the frontend.
Avoiding ad-blockers: Server-side testing is less likely to be affected by ad-blockers or browser extensions that might interfere with client-side testing scripts.
Complex targeting: When you need to use complex user segmentation or targeting rules that require server-side data or processing.
Testing third-party integrations: When testing features that rely on server-side integrations with other services or APIs.
Compliance with regulations: In cases where you need to ensure compliance with data protection regulations or industry-specific requirements.
Gradual feature rollouts: For implementing feature flags or gradual rollouts of new features to specific user segments.
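The gradual-rollout pattern in the last point can be sketched with the same hashing idea: map each user to a stable position between 0 and 100 and compare it against the rollout percentage. This is an illustrative Python sketch, not any specific SDK's implementation:

```python
import hashlib

def is_enabled(distinct_id: str, feature_key: str, rollout_percent: float) -> bool:
    """Return True if this user falls inside the rollout percentage.

    The hash maps each user to a stable position in [0, 100), so raising
    rollout_percent only adds users: nobody who already has the feature
    loses it when the rollout widens.
    """
    digest = hashlib.sha256(f"{feature_key}:{distinct_id}".encode()).hexdigest()
    position = int(digest[:8], 16) / 0xFFFFFFFF * 100
    return position < rollout_percent

# Roll a feature out to 10% of users first, then widen to 50%, then 100%.
if is_enabled("user-123", "new-checkout", 10.0):
    ...  # serve the new feature
```

The same mechanism doubles as a kill switch: setting the percentage back to 0 disables the feature for everyone without a deploy.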
While server-side A/B testing offers many advantages, it can be more complex to set up and maintain compared to client-side testing. It may also require more server resources and can be less flexible for quick iterations. The choice between server-side and client-side A/B testing depends on your specific needs, technical capabilities, and the nature of the experiments you want to conduct.
How to set up server-side testing:
1. Create a new experiment and choose 'Server-side Testing'.
2. Give your campaign a name and assign an experiment key as a unique identifier for your test profile.
3. Next, choose your preferred SDK to view sample code. Copy and paste that sample code into your application to activate the experiment.
4. Update the SDK code with your variables. Once you have these variables set up, you'll be able to determine which version (control or variant) a client sees and enable the appropriate design.
Public Key: This is your unique Mida project key, which you can find in your project settings where you obtained the pixel snippet.
Experiment Key: This is a custom value you create when setting up a server-side test in Mida. It's a unique identifier for your test profile. For example, you might use "new-design-test" as your experiment key.
Distinct ID: This is a unique identifier for each visitor. Since server-side testing doesn't have access to browser-side identifiers such as cookies, you'll need to use an internal system user ID or email address to distinguish between users.
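To make the roles of these three values concrete, here is a hypothetical Python sketch. `ExperimentClient` and `get_variant` are illustrative names only; the real class, method names, and signatures come from the SDK sample code shown in your dashboard:

```python
import hashlib

# Illustrative stand-in only: the real client class and method names come
# from the SDK sample code in the Mida dashboard.
class ExperimentClient:
    def __init__(self, public_key: str):
        self.public_key = public_key  # identifies your Mida project

    def get_variant(self, experiment_key: str, distinct_id: str) -> str:
        # Hashing (experiment, user) gives every request for the same
        # user the same answer, even across multiple servers.
        digest = hashlib.sha256(f"{experiment_key}:{distinct_id}".encode()).hexdigest()
        return "variant" if int(digest, 16) % 2 else "control"

client = ExperimentClient(public_key="pk_example_project")  # from project settings
variant = client.get_variant(
    experiment_key="new-design-test",  # the key you chose in step 2
    distinct_id="user@example.com",    # your internal user ID or email
)
if variant == "variant":
    ...  # render the new design
else:
    ...  # render the control
```

The branching at the end is the part you own: the SDK tells you which bucket the user is in, and your application code serves the matching design.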
5. You can then use the setEvent method to track goal completions. Copy the script and add it to an event which confirms the goal has been reached.
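As a hedged illustration of where the goal-tracking call belongs, the sketch below uses a stand-in `set_event` function; the real `setEvent` signature comes from the SDK sample code, and this sketch assumes it takes the goal name plus the same distinct ID used for variant assignment. The key point is to fire it only after the goal action has actually succeeded:

```python
# Stand-in for the SDK's tracking call, so this sketch runs offline.
# The real setEvent sends the event to Mida's servers.
events = []

def set_event(distinct_id: str, event_name: str) -> None:
    """Record a goal completion for this visitor (illustrative stand-in)."""
    events.append({"distinct_id": distinct_id, "event": event_name})

def complete_checkout(user_id: str) -> None:
    # ... existing checkout logic ...
    # Fire the goal only after the action has definitely succeeded, so
    # the experiment doesn't count abandoned or failed checkouts.
    set_event(distinct_id=user_id, event_name="checkout_completed")

complete_checkout("user@example.com")
```

Using the same distinct ID here as in the assignment step is what lets the platform attribute each conversion to the correct variant.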
6. Add any further configurations to your test and finally 'Publish' it! Congratulations on being one step closer to delivering optimal user experience to your users!