Verified packet scope

This report is grounded in a randomized packet drawn from a bank of 218 questions: 60 validated generic candidates and 40 validated risky candidates, with no gold-reference items (0 benchmark, 0 PYQ), for 100 sampled items in total.

No benchmark or recent PYQ gold set was available for this subject, so the narrative relies on exam-standard judgment plus packet evidence.

General Medicine Question Quality Review


Executive Summary

This review covers 100 sampled questions from the General Medicine pool of 218 items. No benchmark or recent PYQ gold-standard items were available for direct comparison, so quality judgments are made against established NEET-PG and AIIMS-PG examination standards and the internal evidence visible in the sample itself.

The Bloom's distribution is structurally skewed: 65 of 100 sampled items sit at Bloom's levels 1 and 2, with only 8 at level 3, 25 at level 4, 2 at level 5, and none at level 6. For a postgraduate entrance examination that demands clinical reasoning, this is a significant problem.

The most operationally urgent problems, in order of severity, are:

  1. Wrong key or factually unsafe answers — at least two items carry answers that are factually incorrect or clinically dangerous (CPR compression rate, GCS scoring).
  2. Broken delivery due to missing images — approximately 30–35 items are entirely dependent on an image that is not embedded in the question text. Without the image the item is unanswerable and must not be served to candidates.
  3. Low-value, low-yield items — a large cluster of Bloom's-1 recall and nursing-process questions are below the cognitive floor expected of NEET-PG/AIIMS-PG.
  4. Worthwhile concepts with weak execution — several clinically important topics are present but the stem, options, or vignette construction is poor enough to undermine the item.
  5. Repetitive coverage — a small but clear set of duplicate or near-duplicate items appears across both the generic and risky pools.

There are no items in this sample that clearly belong to a different subject entirely, so the Wrong Subject bucket is not invoked. Duplicate and near-duplicate coverage, by contrast, is extensive enough in this sample to warrant its own category below.


What Good Looks Like

Because no benchmark or PYQ gold-standard items were provided, the following description is drawn from the established character of high-quality NEET-PG and AIIMS-PG General Medicine items.

A strong General Medicine item for postgraduate entrance:

  • Presents a clinical vignette with age, sex, relevant history, examination findings, and one or two key investigations. The candidate must synthesise these to reach a diagnosis, choose a next investigation, or select a management step.
  • Tests clinical reasoning, not recall alone. Even a Bloom's-2 item should require the candidate to apply a concept to a patient scenario, not simply retrieve a name or number.
  • Has three plausible distractors (in the standard four-option format) that represent genuine clinical alternatives. Distractors should be wrong for a specific, teachable reason — not obviously absurd.
  • Is self-contained in text. If an image is used, the image must be embedded and the question must be answerable without additional context. Image-based items should still contain enough clinical framing that the image is the discriminating element, not the only element.
  • Has a single, unambiguous correct answer that is defensible against current standard references (Harrison's, Davidson's, current AHA/WHO guidelines).
  • Avoids "All except" and "All are correct except" formats unless the concept genuinely requires distinguishing one outlier from a well-defined set — and even then, the stem must identify the clinical context clearly.
  • Does not test medical history trivia (who invented the stethoscope, who discovered percussion) unless the item is explicitly placed in a History of Medicine module.
  • Does not use nursing-process language ("Which nursing intervention…") in a physician-facing NEET-PG item bank.

Main Issue Categories


1. Wrong Key or Factually Unsafe

Why this pattern is bad

An incorrect answer key is the most serious quality failure in an MCQ bank. If a candidate answers correctly by real-world knowledge and is marked wrong, the item actively harms learning and undermines trust in the platform. Items with factually unsafe keys can also propagate misinformation if used in teaching quizzes.

How it shows up

Two items in the risky pool carry keys that are factually incorrect against current guidelines.

Example question IDs with short explanations

  • 7b353a48 — "According to AHA guidelines, what is the recommended number of chest compressions per minute in CPR?" The marked correct answer is "90 per minute, including neonates." Current AHA guidelines (2010 onward, reaffirmed 2020) specify 100–120 compressions per minute for adults and children; neonates use a different ratio (3:1 with ventilations at approximately 90 compressions per minute in a coordinated sequence, not a standalone rate of 90). The option "100 per minute, excluding neonates" is closer to correct for the adult standard but is marked wrong. As written, the key is factually unsafe for any candidate preparing for NEET-PG. This item also appears in both the generic and risky pools, compounding the risk.

  • 55e6a279 — "A patient…is unresponsive to all stimuli. What is the patient's score on the Glasgow Coma Scale?" The marked correct answer is 5. The minimum possible GCS score is 3 (1 for eye opening + 1 for verbal + 1 for motor), not 5. A patient unresponsive to all stimuli scores 3. This is a fundamental clinical fact tested repeatedly in NEET-PG and the wrong key would directly mislead candidates.
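Key errors of this kind are catchable before an item ever goes live. The sketch below shows a minimal automated sanity check for GCS items, assuming hypothetical field names rather than the bank's actual schema; note that a pure range check would not catch 55e6a279 (5 is a valid GCS total), so a stem/key consistency rule is also needed.

```python
# Minimal sketch of an automated key-sanity check for GCS items.
# Field names and helper names are illustrative, not from the actual bank schema.

GCS_MIN = 3   # 1 (eye) + 1 (verbal) + 1 (motor): lowest possible total
GCS_MAX = 15  # 4 (eye) + 5 (verbal) + 6 (motor)

def gcs_key_is_plausible(keyed_answer: int) -> bool:
    """Return True if a keyed GCS total falls in the valid 3-15 range."""
    return GCS_MIN <= keyed_answer <= GCS_MAX

def flag_unresponsive_gcs(stem: str, keyed_answer: int) -> bool:
    """Flag items describing a fully unresponsive patient whose key is not 3."""
    unresponsive = "unresponsive to all stimuli" in stem.lower()
    return unresponsive and keyed_answer != GCS_MIN

stem = "A patient is unresponsive to all stimuli. What is the patient's GCS score?"
assert gcs_key_is_plausible(5)          # 5 passes a range check alone
assert flag_unresponsive_gcs(stem, 5)   # but the consistency check flags the wrong key
```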

Recommended disposition

  • 7b353a48 — Disable. The stem is ambiguous (adult vs. neonate rate conflated), the key is wrong, and the concept is already well-covered by higher-quality items in standard question banks. Do not attempt a patch; rewrite from scratch if CPR rate needs coverage.
  • 55e6a279 — Disable. The key is unambiguously wrong. The concept (minimum GCS = 3) is simple enough that a clean replacement item can be written in minutes.

2. Broken Delivery (Missing Image, Malformed Options, Incomplete Stem)

Why this pattern is bad

An image-dependent item where the image is absent is not a question — it is an unanswerable prompt. Serving such items to candidates produces random guessing, erodes platform credibility, and generates legitimate complaints. Items with options labelled only "A," "B," "C," "D" (referring to image labels) are equally broken without the image. Items whose stems contain only "Comment on the diagnosis" or "What is the clinical diagnosis?" with no clinical text are completely non-functional in a text-only rendering.

How it shows up

Approximately 30–35 items in the reviewed set are image-dependent. The question text contains phrases such as "shown below," "the image shows," "the following ECG finding," "the above marked structures," or "the picture of." In every case the image is absent from the data provided. This is the single largest operational problem in the sample.

Specific sub-patterns observed:

  • Pure image prompts with no clinical text: The entire question is "The image shows a child with?" (5ce0047b) or "What is the clinical diagnosis?" (f5c53ddc) or "Comment on the diagnosis" (193ad435, 862f84d6). Without the image these are completely unanswerable.
  • Options that are image coordinates: "Which of the following sites is the most common site of intraparenchymal bleeding? A / B / C / D" (60d71a62). Without the labelled diagram, the options are meaningless.
  • Stems that reference "the marked wave," "the marked structures," "the structure marked in red" with no image: 8901d0ae, c2e3c40a, 160305b9, e7f8e88c.
  • ECG-based items with no ECG: 5235e9ae, ec090734, 65b88b3f, 193ad435.
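Because the image-reference phrases are so stereotyped, a hold-from-serving filter can be automated. The sketch below assumes a hypothetical item schema (id / stem / has_image); the phrase list is drawn from the wording observed above.

```python
# Sketch of a hold-from-serving filter for image-dependent items whose image
# is missing. The item schema (id / stem / has_image) is assumed, not actual.

IMAGE_PHRASES = (
    "shown below", "the image shows", "image below", "the following ecg",
    "the above marked", "marked structure", "marked wave", "the picture of",
)

def is_image_dependent(stem: str) -> bool:
    """Heuristic: the stem references an image by one of the stock phrases."""
    s = stem.lower()
    return any(p in s for p in IMAGE_PHRASES)

def items_to_hold(items):
    """Return IDs of items that reference an image but have none embedded."""
    return [it["id"] for it in items
            if is_image_dependent(it["stem"]) and not it["has_image"]]

sample = [
    {"id": "b08741c1", "stem": "Which pattern of breathing is shown below?", "has_image": False},
    {"id": "85221b3a", "stem": "Cannon A waves on JVP examination suggest?", "has_image": False},
]
print(items_to_hold(sample))  # → ['b08741c1']
```

The text-only cannon-A-waves item passes through untouched; only the broken image prompt is held.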

Example question IDs with short explanations

  • b08741c1 — "Which pattern of breathing is shown below?" No waveform image present. Unanswerable.
  • 5ce0047b — "The image shows a child with?" No image. No clinical description. Completely non-functional.
  • 60d71a62 — Options are literally "A," "B," "C," "D" referring to a brain diagram that is absent.
  • 160305b9 — "The above marked structures are seen in a condition where the most common cause of death is?" No image, no condition named. Unanswerable.
  • 8901d0ae — "The normal amplitude of the marked wave is usually less than ___ mm in the chest leads." No ECG image. The blank in the stem is also a formatting artefact.
  • c1a1f4da — "In the score shown below, alphabet P stands for?" No score image. Unanswerable.
  • ccc61dba — "The following test is performed for evaluating the integrity of:" No image of the test. Unanswerable.

Recommended disposition

All image-dependent items must be held from live serving immediately until the corresponding image is confirmed embedded and renders correctly across all delivery surfaces. After image restoration:

  • Items with rich clinical vignettes in the stem (e.g., 316aed4a, 60350736, a7fd2a3b, 7f9248f7) should be kept once the image is verified — the clinical framing is adequate.
  • Items whose entire stem is a one-line image prompt with no clinical context (5ce0047b, 862f84d6, 193ad435, 160305b9, f5c53ddc, b08741c1) should be fixed by adding a minimal clinical vignette so the item is partially answerable even if the image degrades, and so the cognitive demand rises above pure visual pattern recognition.
  • Items with options that are only image coordinates (60d71a62) should be fixed by replacing coordinate labels with anatomical names.

3. Low-Value But Correct (Too Simple, Low-Yield, Trivia-Heavy, Weak Exam Relevance)

Why this pattern is bad

NEET-PG and AIIMS-PG test clinical application, not encyclopaedic recall of eponyms, historical facts, or single-number thresholds that any candidate can memorise in thirty seconds. Items at Bloom's level 1 that ask for a name, a number, or a definition without any clinical context do not discriminate between candidates who understand medicine and those who have merely memorised a list. They also distort the apparent difficulty distribution, making the bank look more balanced than it is. Nursing-process items ("Which nursing intervention…") are written for a different examination audience entirely and have no place in a physician-facing NEET-PG bank.

How it shows up

This is the most numerically prevalent problem in the sample. The following sub-types are observed:

Medical history trivia

  • a4bac6bb (appears in both the generic and risky pools) — "Who discovered percussion?" This is pure historical trivia. It does not test clinical competence.
  • 1f5accff — "Who invented the modern stethoscope?" Same problem.
  • 3d628f15 — "Identify the scientist given in the image below" (Louis Pasteur). Trivia plus broken image.

Single-number recall with no clinical context

  • e344cd1a — "Daily temperature variation in remittent fever is:" A threshold number with no patient scenario.
  • cbbb5458 — "Pleural effusion can be detected clinically when the amount of fluid is more than:" Pure recall.
  • 44ad9920 — "A patient is said to have chronic diarrhea if it is occurring for more than how many weeks?" Pure recall.
  • 0760910a — "What is the formula for parenteral iron therapy?" Formula recall with no clinical application.
  • 3884f507 — "RBCs should be transfused using a needle of which of the following sizes?" Procedural trivia.
  • ff3f07d6 — "One unit of PRBC raises HCT by:" Recall of a number.

Eponym/instrument recall

  • 012dc381 — "Menghini needle is used for biopsy of which of the following organs?" Eponym recall, Bloom's 1.
  • 5fce7bfc — "All are splenic percussion techniques, EXCEPT: Nicolsky's sign." The answer is obvious to anyone who has heard of Nikolsky's sign in dermatology; no clinical reasoning required.
  • 3da99430 — "Puddle sign is seen in:" Single-association recall.
  • 665ed447 — "What is the term for gas-filled cysts found in the subserosa or submucosa…?" Definition recall.

Nursing-process items (wrong audience)

  • 025b30a8 — "A patient is diagnosed with constipation. Which nursing intervention is appropriate…?" This is a nursing board question, not a NEET-PG physician question.
  • 8d6faba2 — "A client with jaundice is experiencing pruritus. Which nursing intervention would be included in the care plan?" Same problem. "Client" language is a further marker of nursing-exam origin.

Trivially obvious clinical questions

  • 4193718e — Anaphylaxis after eating with facial swelling, hypotension, pruritus, and "feeling of impending doom." The diagnosis is given away by the stem; distractors (MI, food stuck in throat) are not plausible alternatives for a NEET-PG candidate.
  • 62d2535d — "Most common symptom of duodenal ulcer?" Epigastric pain. No clinical reasoning required.
  • 75ff46c6 — "Most common cause of pulmonary thromboembolism?" DVT. Bloom's 2 but trivially easy for the target audience.
  • ba07959d — "Which metabolic abnormality is seen in multiple myeloma?" Hypercalcemia. Bloom's 1, universally known.
  • 61402e42 — "Most common cause of aortic aneurysm?" Atherosclerosis. Bloom's 1.

Recommended disposition

  • Medical history trivia (a4bac6bb, 1f5accff, 3d628f15): Disable. No exam-relevant value.
  • Nursing-process items (025b30a8, 8d6faba2): Disable. Wrong audience; concept can be tested in a physician-appropriate format if needed.
  • Single-number, eponym, and definition recall with no vignette (e344cd1a, cbbb5458, 44ad9920, 0760910a, 3884f507, ff3f07d6, 012dc381, 3da99430, 665ed447): Disable unless the concept is genuinely high-yield and no better item exists, in which case fix by embedding the fact in a clinical scenario.
  • Trivially obvious clinical questions (4193718e, 62d2535d, 75ff46c6, ba07959d, 61402e42): Disable. These concepts are better served by higher-Blooms items already present or easily written.

4. Worthwhile Concept, Weak Execution (Keep the Concept, Fix the Stem/Options/Vignette)

Why this pattern is bad

Several items in the sample address genuinely high-yield NEET-PG topics — cannon A waves, Mobitz II post-MI, Wegener's granulomatosis, tuberous sclerosis, sickle cell acute chest syndrome, aluminium phosphide poisoning — but the execution is flawed in ways that reduce discriminative value or introduce ambiguity. These are worth fixing rather than discarding because the underlying concept is exam-relevant.

How it shows up

Ambiguous or incomplete stems that rely entirely on a missing image

Several items have a good clinical vignette but then pivot to "the image shows" or "ECG was performed" without embedding the image. The vignette alone is often enough to answer the question, which means the image is either redundant or the question is broken. Examples:

  • e8bba677 (aluminium phosphide poisoning) — The vignette is excellent and clinically rich. The "except" question about management is high-yield. However, "ECG was performed" is referenced but the ECG is absent. The correct answer (emergent gastric lavage is contraindicated) is answerable from the vignette alone, so this item can be fixed by removing the ECG reference or embedding the ECG.
  • 1811e581 (sickle cell acute chest syndrome) — Good concept, good "except" format. The "hematological abnormality" image is absent but the vignette (severe chest pain, difficulty breathing in a patient with a haematological condition) is sufficient context. Fix by embedding the peripheral smear image or making the haematological diagnosis explicit in the stem.
  • 316aed4a (Wegener's granulomatosis / GPA) — Excellent vignette: smoker, chronic sinusitis, haemoptysis, puffy eyelids, RBC casts on urine microscopy, CT chest. The diagnosis is fully derivable from the text. The CT reference adds nothing without the image. Keep — this is one of the stronger items in the sample. Embed CT image as supplementary.

"All except" items where the exception is not clearly wrong

  • cbc4b80e — "All are correct about the condition shown below except: Calcification of interosseous membrane." The condition is not named in the stem (image absent). The correct answer (calcification of interosseous membrane is NOT a feature) is only verifiable if the condition is known. Without the image this is broken. With the image it is a reasonable Bloom's-5 item. Fix by naming the condition in the stem or embedding the image.
  • e11c9245 — "Which of the following is responsible for development of the disease, whose Barium swallow is shown below?" The correct answer is "All of the above" (myenteric + Meissner + Auerbach plexus damage). "All of the above" is a weak option in MCQ design because it rewards test-taking strategy over knowledge. The concept (achalasia pathophysiology) is high-yield. Fix by replacing "All of the above" with a specific incorrect option and restructuring the stem to name the condition.

Distractors that are too weak or implausible

  • 5eb14b13 — Plummer-Vinson syndrome: koilonychia + iron deficiency + dysphagia in a 40-year-old woman. The diagnosis is given away by the triad. Distractors (achalasia, oesophageal stricture, "none of the above") are not competitive. "None of the above" is a particularly poor distractor in a diagnosis question. Fix by replacing "none of the above" with a plausible alternative (e.g., Zenker's diverticulum or oesophageal web from another cause) and adding one more discriminating clinical detail.
  • fefd0363 — "Which organ is least commonly affected by arterial thromboembolism? Liver." The liver's dual blood supply is the reason, which is a teachable concept. However, the distractors (kidney, brain, heart) are not well-calibrated — the heart is an unusual distractor for arterial embolism destination. Fix by replacing "heart" with "spleen" or "intestine" to make the distractor set more clinically coherent.

Stems with embedded answer-giveaway language

  • 9abc6f77 — "Spurious hypertension is seen in which of the following situations? Auscultatory gap." The term "spurious" in the stem and "auscultatory gap" as the answer are both correct, but the other options (small cuff size, thick calcified vessels, obesity) all cause true or artifactual hypertension, not spurious hypertension in the strict sense. The conceptual distinction between spurious and artifactual hypertension is genuinely testable, but the stem needs to define what it means by "spurious" or the item will generate legitimate disputes. Fix by clarifying the stem.

Related CPR guideline item (same topic as the wrong-key item in Category 1)

  • a8fba74f — "Which of the following is NOT a major change in the 2010 AHA Guidelines for CPR for lay rescuers?" The keyed answer (emphasis on 'Look, Listen, and Feel' was removed in 2010, not added) is factually correct, and the item is better constructed than 7b353a48 on the same topic. However, the stem is dated (it cites the 2010 guidelines; the current version is 2020), even though this kind of negative-knowledge question about guideline changes is appropriate for NEET-PG. Fix by updating to the 2020 AHA guidelines and verifying the key against the current version.

Recommended disposition

  • e8bba677, 1811e581: Fix — embed image or make diagnosis explicit in stem.
  • 316aed4a: Keep — strong vignette, correct key, high-yield concept. Embed CT image as enhancement.
  • cbc4b80e, e11c9245: Fix — name the condition in the stem; replace "All of the above."
  • 5eb14b13: Fix — replace "none of the above" distractor; add one discriminating clinical detail.
  • fefd0363: Fix — replace "heart" distractor with a more plausible option.
  • 9abc6f77: Fix — clarify "spurious" in the stem.
  • a8fba74f: Fix — update to 2020 AHA guidelines and re-verify key.

5. Repetitive or Duplicative Coverage

Why this pattern is bad

Duplicate items waste bank capacity, inflate apparent coverage, and — when both versions are served — expose candidates to the same question twice in a session or across sessions, which undermines test security and perceived fairness.

How it shows up

Several items appear in both the generic and risky pools with identical question text and identical keys, indicating they have been ingested twice under different IDs or flagged into both pools without deduplication.

Example question IDs with short explanations

  • a4bac6bb (generic) and a4bac6bb (risky) — "Who discovered percussion?" — Same question ID appears in both pools. This is a direct duplicate.
  • 025b30a8 (generic) and 025b30a8 (risky) — "A patient is diagnosed with constipation. Which nursing intervention…" — Same question ID in both pools.
  • e344cd1a (generic) and e344cd1a (risky) — "Daily temperature variation in remittent fever" — Same ID in both pools.
  • eed7fa42 (generic) and eed7fa42 (risky) — "Which of the following is correct about the image shown? Dracunculus" — Same ID in both pools.
  • 80425c89 (generic) and 80425c89 (risky) — "Mucinous ascites" — Same ID in both pools.
  • fbbf4ac9 (generic) and fbbf4ac9 (risky) — "Small size BP cuff" — Same ID in both pools.
  • 5fce7bfc (generic) and 5fce7bfc (risky) — "Splenic percussion techniques" — Same ID in both pools.
  • 8d6faba2 (generic) and 8d6faba2 (risky) — "Nursing intervention for jaundice pruritus" — Same ID in both pools.
  • ba07959d (generic) and ba07959d (risky) — "Metabolic abnormality in multiple myeloma" — Same ID in both pools.
  • 7b353a48 (generic) and 7b353a48 (risky) — "CPR compressions per minute" — Same ID in both pools (also flagged as wrong key).
  • 59abe921 (generic) and 59abe921 (risky) — "Rhabdomyolysis" — Same ID in both pools.
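Given a per-pool export of question IDs, these collisions are exactly the intersection of the two pools' ID sets. A minimal sketch (pool contents here are a small illustrative subset, not the full export):

```python
# Sketch of the cross-pool deduplication check. Pool contents are a small
# illustrative subset of the IDs discussed in this review, not the full export.

generic_pool = {"a4bac6bb", "025b30a8", "e344cd1a", "7b353a48", "59abe921", "85221b3a"}
risky_pool   = {"a4bac6bb", "025b30a8", "e344cd1a", "7b353a48", "59abe921", "55e6a279"}

# IDs served from both pools without deduplication:
duplicates = sorted(generic_pool & risky_pool)
print(duplicates)  # → ['025b30a8', '59abe921', '7b353a48', 'a4bac6bb', 'e344cd1a']
```

Running this against the real exports would confirm the 11 pairs listed above and catch any future re-ingestion under duplicate IDs.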

Beyond exact ID duplicates, there are near-conceptual duplicates on the same narrow topic:

  • fbbf4ac9 ("small BP cuff → false elevation") and 9abc6f77 ("spurious hypertension → auscultatory gap") cover adjacent but distinct concepts and can coexist if both are fixed; however, if the bank already has strong coverage of BP measurement errors, one may be redundant.
  • f6600a9b ("Chylous ascites caused by all except: colloid carcinoma of stomach") and 80425c89 ("Mucinous ascites seen in: stomach carcinoma") — both test ascites types with stomach carcinoma as the pivot. The conceptual overlap is high. One should be kept (the better-constructed one) and the other disabled.

Recommended disposition

  • All items with the same question ID appearing in both pools: deduplicate immediately — retain in whichever pool is appropriate based on quality flags, remove from the other.
  • f6600a9b vs 80425c89: Keep 80425c89 (simpler, cleaner stem) and disable f6600a9b unless the chylous ascites concept is not otherwise covered — in which case fix f6600a9b to be clearly about chylous (not mucinous) ascites with better distractors.

Prioritization

The table below ranks action categories by urgency and operational impact.

Priority      | Category                               | Estimated Items Affected         | Recommended Action
1 — Immediate | Wrong Key / Factually Unsafe           | 2 confirmed (7b353a48, 55e6a279) | Disable before next live session
2 — Immediate | Broken Delivery (missing image)        | ~30–35 items                     | Hold from serving; restore images or fix stems
3 — High      | Deduplication (same ID in both pools)  | 11 confirmed ID pairs            | Deduplicate; apply disposition from other categories
4 — High      | Low-Value / Trivia / Nursing-process   | ~20–25 items                     | Disable (most); fix a small subset if concept is high-yield
5 — Medium    | Worthwhile Concept, Weak Execution     | ~8–10 items                      | Fix stems, options, or image dependency
6 — Lower     | Near-conceptual duplicates             | 2–3 pairs                        | Disable weaker item of each pair

Example Keep / Fix / Disable Calls

Keep (as-is or after image verification)

316aed4a — Granulomatosis with polyangiitis (Wegener's). Excellent clinical vignette: smoker, chronic sinusitis, haemoptysis, periorbital oedema, RBC casts. Diagnosis is fully derivable from text. Distractors are plausible (endobronchial TB, PAN, LAM). Bloom's 4. High-yield NEET-PG concept. Keep; embed CT image as supplementary enhancement.

85221b3a — Cannon A waves → third-degree heart block. Clean, text-only stem. Distractors are the correct differential (first-degree, second-degree, Mobitz II). Bloom's 4. Classic NEET-PG question. Keep.

8661daec — Visceral leishmaniasis (kala-azar) with bone marrow aspiration showing LD bodies. Rich vignette. The "except" tests a specific pharmacological fact (miltefosine half-life and monotherapy use). Bloom's 5. High-yield. Keep; verify image of bone marrow smear is embedded.

60350736 — Tuberous sclerosis with skin lesion, mental retardation, flank pain, CT showing fat-density lesions in kidney and liver. Bloom's 4. Good distractor set (VHL, ADPKD, familial angiolipomatosis). Keep; verify skin lesion image is embedded.

7f9248f7 — Romana sign / Chagas disease / Triatomine bug. Excellent clinical scenario (diplomat from Peru, child with periorbital swelling). Distractors test related vector-borne disease signs. Bloom's 4. Keep; verify image is embedded.

ec090734 — Pneumocephalus post facial trauma in a Parkinson's/AF patient on aspirin. Bloom's 4. Good clinical context. Keep; verify CT image is embedded.


Fix

e8bba677 — Aluminium phosphide poisoning. Excellent vignette and high-yield "except" question. Fix: embed ECG image or remove the ECG reference from the stem; the vignette alone supports the question.

e11c9245 — Achalasia pathophysiology (Barium swallow). Fix: name the condition in the stem; replace "All of the above" with a specific incorrect option (e.g., "Damage to interstitial cells of Cajal only").

5eb14b13 — Plummer-Vinson syndrome. Fix: replace "None of the above" with a plausible distractor (e.g., Zenker's diverticulum); add one discriminating clinical detail to reduce giveaway.

a8fba74f — 2010 AHA CPR guidelines. Fix: update to 2020 AHA guidelines; re-verify key; the concept (what was removed vs. added) is genuinely testable.

9abc6f77 — Spurious hypertension / auscultatory gap. Fix: clarify "spurious" in the stem to distinguish from artifactual hypertension caused by cuff size or vessel calcification.

fefd0363 — Liver least commonly affected by arterial thromboembolism. Fix: replace "heart" distractor with "spleen" or "intestine" for a more clinically coherent distractor set.

cbc4b80e — "All correct except: calcification of interosseous membrane." Fix: name the condition in the stem (appears to be diaphyseal aclasia / hereditary multiple exostoses based on the options); embed image.


Disable

7b353a48 — CPR compression rate: wrong key (90/min marked correct; correct answer is 100–120/min for adults). Disable immediately.

55e6a279 — GCS for unresponsive patient: wrong key (5 marked correct; minimum GCS is 3). Disable immediately.

a4bac6bb — "Who discovered percussion?" Medical history trivia. No exam relevance. Disable.

1f5accff — "Who invented the modern stethoscope?" Medical history trivia. Disable.

3d628f15 — "Identify the scientist in the image" (Pasteur). Trivia plus broken image. Disable.

025b30a8 — "Which nursing intervention is appropriate for constipation?" Nursing-board language, wrong audience. Disable.

8d6faba2 — "Which nursing intervention for jaundice pruritus?" Same problem. Disable.

012dc381 — "Menghini needle is used for biopsy of which organ?" Eponym recall, Bloom's 1, no clinical context. Disable.

e344cd1a — "Daily temperature variation in remittent fever." Pure threshold recall, no vignette. Disable.

44ad9920 — "Chronic diarrhea if occurring for more than how many weeks?" Pure recall. Disable.

0760910a — "Formula for parenteral iron therapy." Formula recall, Bloom's 1. Disable.

3884f507 — "RBCs should be transfused using a needle of which size?" Procedural trivia. Disable.

665ed447 — "What is the term for gas-filled cysts in the subserosa?" Definition recall, Bloom's 1. Disable.

3da99430 — "Puddle sign is seen in:" Single-association recall. Disable.

cbbb5458 — "Pleural effusion can be detected clinically when fluid is more than:" Threshold recall. Disable.

4193718e — Anaphylaxis after eating. Diagnosis given away by stem; distractors not plausible for NEET-PG level. Disable.

61402e42 — "Most common cause of aortic aneurysm?" Bloom's 1, trivially easy. Disable.

ba07959d — "Metabolic abnormality in multiple myeloma?" Bloom's 1, universally known. Disable.

8e63e17e — Aphthous ulcers: correct answer is "Any of the above." This is an inherently ambiguous key that cannot be defended as a single correct answer in a standard MCQ format. Disable.

f6600a9b — Chylous ascites "except" question: overlaps conceptually with 80425c89 (mucinous ascites); the "except" answer (colloid carcinoma of stomach) is debatable since pseudomyxoma peritonei from mucinous tumours can cause gelatinous/mucinous ascites, not chylous. Factual ambiguity plus duplication. Disable.