CSC Struds 12 Standard (Apr 2026)

The AI warns: “Unauthorized deviation. Solutions must be selected from the decision tree.”

The room freezes. Project Phoenix was a myth. The minister’s face twitches. “That program is dead.”

Rohan sees his own profile: “Subject Rohan: High creativity, low compliance. Suggested destination: Red Stream (Field Maintenance). Neural modification recommended.”

With shaking hands, the tech officer plugs the watch into the mainframe. On the giant screen, a new evaluation appears—not a rank, but a message:

“Personalized Learning. Imperfect Outcome. Perfect Human.”

Rohan ignores it. He manually overrides the drone controls, orders the fishing villagers to use their traditional wooden boats (which the algorithm had dismissed as “obsolete”), and reroutes the rescue AI to act as a decentralized swarm—each boat captain making real-time decisions.

But Rohan can’t. He keeps asking why. Why does the algorithm always choose the solution that benefits the largest demographic but crushes the smallest? Why does it never allow for creative failure? One night, while Rohan is trying to download a practice Crucible scenario, his cracked smartwatch accidentally syncs with the CSC’s quantum core. A cascade of data flows into the watch—not study material, but something forbidden: the original source code of the CSC evaluation system.

His hands tremble. The watch also contains one final, corrupted file: Project Phoenix—an alternate evaluation model that his father had been working on before he died. It was scrapped because it valued “unstructured human judgment.”

The morning of The Crucible arrives. Rohan enters the simulation pod, heart pounding. Around him, a hundred other Struds plug in, their faces calm, sedated by preparatory beta-blockers. Meera gives him a worried nod.

Rohan never gets a rank. He becomes the first “Strud Zero”—a consultant who teaches other students how to trust their messy, human, glorious instincts over the cold perfection of the algorithm.

Near-future India, 2032. The government’s CSC (Common Service Centres) have evolved from simple digital kiosks into sprawling, AI-driven “Stratospheric Learning Hubs.” Every village and urban block has one. The final exam of the 12th Standard is no longer a written test but a 48-hour immersive simulation called “The Crucible.”

Hidden within are the “Stratification Algorithms”—the secret logic that doesn’t just test students but shapes them. Rohan discovers the truth: the CSC’s 12th Standard isn’t designed to unlock potential. It’s designed to sort students into pre-determined socio-economic layers: Blue for governance, Green for tech, Red for manual services. The Crucible isn’t a test of problem-solving; it’s a loyalty check. The system rewards students who make predictable, risk-free choices.

And every year, during the 12th Standard Crucible, a single question appears on every student’s screen—the one Rohan added to the source code before they patched him out:

Rohan Deshmukh is a bright but anxious student from the Latur district. He is a “CSC Strud” (slang for a student exclusively trained in the CSC’s high-pressure, stratified curriculum). His only possession of value is a cracked, antique smartwatch that belonged to his late father—a former government officer who believed in human intuition over machine logic.

Part 1: The Stratified World

Rohan lives in a world where your “CSC Rank” determines your future. At age 17, every student enters the CSC’s 12th Standard program. The Hubs are sterile, humming palaces of holographic tutorials, bio-sensor desks, and neural-feedback headsets. The motto on the wall reads: “Personalized Learning. Perfect Outcome.”