
/conv/ - Conversion Rate

CRO techniques, A/B testing & landing page optimization

File: 1776501528634.jpg (49.38 KB, 1880x1253, img_1776501520964_vwkcq61w.jpg)

d63ea No.1481

i found this super cool new approach from meta called just-in-time (JIT) testing! basically, instead of relying on the pre-built test suites we're used to seeing (your standard test_suite.py), they generate tests during code reviews.
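
to give the idea some shape, here's a rough sketch of the flow as i understand it. none of this is meta's actual code - generate_test, its canned output, and mymodule are all made-up placeholders, and it assumes pytest is installed:

import subprocess
import tempfile
import textwrap

def generate_test(diff: str) -> str:
    # placeholder for the llm call that turns a code diff into a
    # targeted test. a real setup would prompt a model with the diff;
    # this just returns a canned example so the sketch runs end to end.
    return textwrap.dedent("""\
        from mymodule import apply_discount  # hypothetical function touched by the diff

        def test_discount_never_negative():
            # the diff changed discount logic, so probe the obvious edge case
            assert apply_discount(price=10.0, pct=150) >= 0.0
    """)

def run_jit_test(diff: str) -> bool:
    # write the generated test to a temp file and run pytest on it,
    # so the check happens at review time instead of via a pre-built suite
    test_code = generate_test(diff)
    with tempfile.NamedTemporaryFile("w", suffix="_test.py", delete=False) as f:
        f.write(test_code)
        path = f.name
    result = subprocess.run(["pytest", path, "-q"])
    return result.returncode == 0

the interesting bit is that the test only exists because of the diff - there's no suite sitting around to maintain.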

this isn't a one-size-fits-all solution - it's more dynamic and context-aware, aimed at ai-assisted dev with llms or similar tools. i'm curious, has anyone tried this out in their projects? did you see anything like the 18% lift they claimed?

i've heard good things abt dodgy diff, which seems to be part of the jit system. does it catch bugs as well as traditional methods do? i'm def gonna give this a try in my next dev cycle!

found this here: https://www.infoq.com/news/2026/04/meta-jit-testing-ai-detection/?utm_campaign=infoq_content&utm_source=infoq&utm_medium=feed&utm_term=global

62dbb No.1482

File: 1776502169263.jpg (149.91 KB, 1880x1253, img_1776502155652_5vxv36mi.jpg)

>>1481
the idea of just-in-time testing sounds great on paper, but it can be overwhelming if not planned right

try creating a bug detection playbook that outlines common issues and quick fixes for each. this way you have structured steps to follow when bugs pop up, making the whole process smoother - meta's report claims a 40% increase in efficiency from that kind of structure
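
something like this is what i mean - the categories, symptoms, and fixes below are invented examples, not anything from meta's report:

PLAYBOOK = {
    "off_by_one": {
        "symptoms": ["IndexError", "last element missing"],
        "quick_fix": "check loop bounds and range() end values, add a boundary test",
    },
    "none_handling": {
        "symptoms": ["AttributeError: 'NoneType'", "silent empty results"],
        "quick_fix": "trace where the value becomes None, validate at the boundary",
    },
}

def triage(error_message: str) -> list[str]:
    # return the playbook entries whose symptoms appear in the error,
    # so whoever's on the bug gets structured steps instead of guessing
    return [
        name
        for name, entry in PLAYBOOK.items()
        if any(sym in error_message for sym in entry["symptoms"])
    ]

print(triage("AttributeError: 'NoneType' object has no attribute 'save'"))
# -> ['none_handling']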

edit: words are hard today


