Radiology Question Quality Review
Executive Summary
This review covers a candidate sample of 100 validated non-gold questions (60 generic + 40 risky) drawn from a pool of 5,382 Radiology items. No benchmark questions are available for this subject; the quality bar is set by 12 recent PYQs.
The headline finding is a severe Blooms-level compression toward the bottom of the taxonomy. In the candidate sample, 40% of questions sit at Blooms 1 and 50% at Blooms 2, leaving only 10% at levels 3–4 combined. The PYQ set, by contrast, contains multiple Blooms 4 image-based items requiring clinical synthesis. This gap is not a minor calibration issue — it reflects a structural over-production of recall and definitional questions that do not prepare candidates for the reasoning demands of INICET, NEET-PG, or FMGE.
Beyond the Blooms problem, the reviewed set shows five operationally distinct quality failures:
- A meaningful cluster of wrong-key or factually unsafe items where the marked correct answer is contestable or demonstrably incorrect.
- A large volume of low-value, low-yield trivia — questions about X-ray tube filament dimensions, DICOM acronym expansion, and bitewing angulation degrees — that have no realistic exam relevance.
- Several broken delivery items that depend on an image that is absent from the text, making them unanswerable without the visual.
- A smaller but important group of wrong-subject or wrong-topic placement items, particularly dental radiology questions filed under general Radiology topics.
- A sizeable group of worthwhile concepts with weak execution — the underlying clinical concept is exam-relevant but the stem is a bare one-liner with no clinical context, or the distractors are implausible.
Repetitive coverage is present but is a secondary concern relative to the above.
The overall disposition recommendation for this sample: approximately 20–25 items should be disabled outright, 35–40 items require substantive fixes before use, and the remainder can be kept with minor edits or as-is.
What Good Looks Like
The PYQ set provides a clear benchmark. The strongest items share four properties:
Clinical anchoring. The best questions embed the radiological finding in a patient scenario. PYQ 9a9d017c presents a cyanotic child with a chest radiograph; PYQ 57154e37 presents a 35-year-old with knee pain and an X-ray. The candidate must integrate clinical context with the image finding, not simply retrieve a memorised label.
Image dependency that is genuine, not decorative. PYQs 9a9d017c, 8137eba5, 95a5f2b9, and 57154e37 are all Blooms 4 items where the image is the question: none of them can be answered from the text alone. This is the correct use of the image-based format in Radiology.
Distractor quality that reflects real diagnostic confusion. In PYQ e8d11636 (ground-glass haziness EXCEPT), all four options are plausible neonatal conditions; the correct answer requires knowing that left-to-right shunts do not produce ground-glass opacity in the same way. In PYQ 4ed124c2 (USG vs mammography), the distractors represent genuine advantages of USG, making the "not an advantage" framing meaningful.
Appropriate Blooms distribution. Even the easier PYQs (e.g., a1efe41d — Banana sign, Blooms 1) test a specific, high-yield radiological sign with direct exam precedent. The difficulty is justified by the concept's importance, not by the question's simplicity.
Correct use of "EXCEPT/NOT" format. When used, the negative format tests a complete conceptual map (e.g., all features of a condition) rather than a single isolated fact.
The candidate sample should be measured against this standard throughout.
Main Issue Categories
1. Wrong Key or Factually Unsafe
Why this pattern is bad
A wrong key is the most serious quality failure in a question bank. It directly harms candidates who reason correctly and are penalised for it. In a high-stakes PG entrance context, even a single wrong-key item in a live test can generate mass reports and erode platform credibility. Items in this category must be separated from low-quality-but-correct items because the remediation path is different: a wrong key requires expert clinical review and key correction or disabling, not just a stem rewrite.
How it shows up
In this sample, wrong-key risk appears in two forms: (a) items where the marked answer is factually incorrect by current consensus, and (b) items where the marked answer is one of several defensible answers and the question does not specify the clinical context needed to disambiguate.
Example question IDs and explanations
b30dec80 — "Which of the following are ultrasound signs of fetal death?"
The marked correct answer is "Hegar's sign." This is factually wrong. Hegar's sign is a clinical obstetric sign (softening of the lower uterine segment), not an ultrasound sign of fetal death. The genuine ultrasound signs of fetal death include absent cardiac activity, Spalding sign (overlapping skull bones), Robert's sign (gas in fetal vessels), and the halo sign. Three of the four options listed (halo sign, absent heartbeat, Spalding sign) are actual ultrasound signs of fetal death; the correct answer to "which of the following ARE ultrasound signs" should be one of those three, not Hegar's sign. This item has an inverted key — the marked answer is the one option that is definitively NOT an ultrasound sign. Disable immediately.
89302f96 — "Which scan is used to identify viable myocardium after myocardial infarction?"
The marked answer is PET scan. While FDG-PET is used for myocardial viability assessment, Thallium-201 scintigraphy is the classical and historically dominant answer in Indian PG exam tradition for this question, and both are defensible. The question does not specify "gold standard," "most accurate," or "most commonly used," making the key ambiguous. The distractor "Thallium 201 scan" is a strong competing answer that many well-prepared candidates will select. This item needs a stem qualifier or key revision. Fix: add "most accurate" or "gold standard" qualifier and verify against current guidelines.
d991eb12 — "Subcutaneous calcifications are seen in which of the following conditions?"
The marked answer is Hyperparathyroidism alone. However, subcutaneous calcifications are well-documented in multiple conditions including calcinosis cutis (associated with dermatomyositis, scleroderma), tumoral calcinosis, and also in hypoparathyroidism (basal ganglia calcification is more typical, but subcutaneous calcification is described). The option "All of the above" includes Gout (tophi are periarticular, not typically subcutaneous calcification in the radiological sense) and Ochronosis (which causes intervertebral disc and cartilage calcification, not subcutaneous). The question is ambiguous enough that the key is contestable. Fix: restrict to a specific clinical scenario or reframe as a more precise radiological finding.
c4c0eabe — "Gas absent from intestine (gasless abdomen) on x-ray is seen in which condition?"
The marked answer is Acute pancreatitis. This is factually questionable. Acute pancreatitis classically produces a sentinel loop sign (localised ileus) or colon cut-off sign — both of which involve gas. A truly gasless abdomen is more characteristic of high small bowel obstruction (e.g., duodenal atresia in neonates), or when the bowel is fluid-filled (as in some cases of intussusception or meconium ileus). The association of gasless abdomen with acute pancreatitis is not a standard teaching point in major Indian radiology textbooks. Fix: verify against Sutton/Grainger or disable if unsupported.
ee74a6e3 — "What is the recommended dose of I-131 for carcinoma of the thyroid?"
The marked answer is "Both A and B" (30–100 mCi low-risk + 100–200 mCi high-risk). The "Both A and B" construction is a structurally weak option type that functions as a combination answer. More critically, I-131 dosing protocols vary by institution and guideline (ATA 2015 guidelines use risk-stratified dosing with different thresholds), and the specific numeric ranges given do not match a single authoritative source cleanly. This item is factually imprecise and uses a problematic option format. Fix: rewrite with a single defensible numeric range from a named guideline, or disable.
Recommended disposition: Disable b30dec80. Fix 89302f96, d991eb12, c4c0eabe, ee74a6e3 after expert clinical review.
2. Wrong Subject or Wrong Topic Placement
Why this pattern is bad
Subject contamination creates two operational problems. First, it misleads candidates about what Radiology covers, potentially displacing genuinely high-yield Radiology content. Second, it creates scoring anomalies when questions are served in subject-specific practice sets. Dental radiology questions filed under general Radiology topics are a specific and recurring problem in this sample — they belong to a different clinical discipline with a different candidate population.
How it shows up
In this sample, the contamination is primarily dental radiology content appearing under "Radiological Anatomy" and "Contrast and Radiological Procedures" topic labels. These questions test dental-specific knowledge (bitewing radiograph angulation, radiopacity of dental materials, periapical cemental dysplasia) that is irrelevant to MBBS-stream PG entrance examinations.
Example question IDs and explanations
ce94779d — "What is the recommended angulation when taking a bitewing radiograph to prevent overlapping of the cusps on the occlusal surface?"
This is a dental radiology question. Bitewing radiographs are a dental diagnostic technique. The topic is filed under "Contrast and Radiological Procedures" in a general Radiology bank. No INICET, NEET-PG, or FMGE question for MBBS graduates tests bitewing angulation. Disable: wrong subject for this bank.
8c88d97e — "Which of the following can be detected radiographically? (Interproximal caries / Root caries / Deep caries)"
This is a dental radiology question about caries detection. Filed under "Radiological Anatomy." Completely outside the scope of medical PG Radiology. Disable: wrong subject.
f8434556 — "Which of the following interpretations cannot be associated with the given radiograph? (Recurrent pericoronitis / Periapical cemental dysplasia / Peripheral sclerosing osteitis / Impacted 3rd molar)"
All four options are dental pathology terms. This is a dental radiology image question filed under "Radiological Anatomy." Additionally, it is a broken delivery item (image absent — see Category 3). Disable: wrong subject and broken delivery.
fc6dd7db — "All of the following materials are radiopaque in nature except? (Zinc phosphate / Composite / Gutta percha / Amalgam)"
Zinc phosphate cement, composite resin, gutta percha, and amalgam are all dental restorative materials. This is dental radiology content. Filed under "Radiological Anatomy." Disable: wrong subject.
6fb58e7e — "Excessive amounts of which of the following components will turn the solution milky? (Sodium sulfite / Ammonium sulfite / Ammonium thiosulphate)"
This appears to test photographic darkroom chemistry (fixer solution composition), which is an obsolete topic in the era of digital radiology and is not tested in any current Indian PG entrance examination. The clinical relevance is zero. Filed under "Contrast and Radiological Procedures." Disable: wrong topic and zero exam relevance.
Recommended disposition: Disable all five items above. Audit the full "Radiological Anatomy" and "Contrast and Radiological Procedures" topic buckets for further dental/darkroom chemistry contamination.
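The audit recommended above could be bootstrapped with a simple keyword flag before expert review. The sketch below is illustrative only: the field names (`qid`, `topic`, `text`) are assumed, not the bank's actual schema, and the keyword list is a starting point that would need clinical curation.

```python
# Sketch of a keyword-based contamination flag for topic-bucket audits.
# Field names (qid, topic, text) are illustrative assumptions, not the
# bank's actual schema; the keyword list would need clinical review.
DENTAL_DARKROOM_KEYWORDS = [
    "bitewing", "occlusal", "periapical", "gutta percha", "amalgam",
    "caries", "pericoronitis", "zinc phosphate", "fixer", "thiosulphate",
]

def flag_contamination(items):
    """Return IDs of items whose text mentions a dental/darkroom keyword."""
    flagged = []
    for item in items:
        text = item["text"].lower()
        if any(kw in text for kw in DENTAL_DARKROOM_KEYWORDS):
            flagged.append(item["qid"])
    return flagged

# Two items from this review, paraphrased: one contaminated, one clean.
sample = [
    {"qid": "ce94779d", "topic": "Contrast and Radiological Procedures",
     "text": "What is the recommended angulation when taking a bitewing radiograph?"},
    {"qid": "81c353a9", "topic": "Neuroradiology",
     "text": "Which imaging feature favors PML over HIV encephalopathy?"},
]
print(flag_contamination(sample))  # ['ce94779d']
```

A keyword hit is a flag for human review, not a disposition decision; "caries", for example, could in principle appear in a legitimate maxillofacial imaging stem.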
3. Broken Delivery (Missing Image, Malformed Options, Incomplete Stem)
Why this pattern is bad
Image-dependent questions without an image are completely unanswerable. They cannot be served to candidates in any format. Unlike conceptual quality problems, broken delivery is a binary failure — the item is non-functional regardless of how good the underlying concept is. These items must be identified and disabled or held until the image is attached and verified.
A secondary broken delivery problem in this sample is the use of "All of the above" and "Both A and B" as answer options. These are structurally weak option formats that inflate guessing probability, reduce distractor quality, and are explicitly avoided in high-quality PG question banks.
How it shows up
Multiple items in the sample have stems that say "the given CT scan shows," "the given NCCT shows," "which is true about the CT angiography report shown," or "which of the following conditions is characterized by the finding shown" — but no image is present in the question text. These items are entirely dependent on a visual that is absent.
Example question IDs and explanations
0b3207b6 — "The given CT scan shows which brain lesion?"
Options include Subdural hematoma, Brain abscess, Medulloblastoma, Oligodendroglioma. No image is present. Without the CT image, this question cannot be answered — the options span completely different pathologies with different CT appearances. The stem provides zero clinical context. Disable until image is attached and verified.
feabcddc — "A 28-year-old female with CKD presents with bone pain and elevated PTH. What is the sign visible in the image?"
The clinical vignette is well-constructed (CKD + elevated PTH strongly suggests renal osteodystrophy / rugger-jersey spine), but the question explicitly asks about "the sign visible in the image" and no image is present. The stem context is strong enough that this could be rewritten as a text-only question ("Which of the following spinal X-ray signs is characteristic of renal osteodystrophy?"), but as currently written it is broken. Fix: either remove the image reference and reframe as a text question, or attach the correct image.
6641c8e0 — "Which is true about the CT angiography report shown?"
No image or report is present. Options include Fibromuscular dysplasia, Hydronephrosis, Duplication of renal collecting system, Renal artery stenosis. Completely unanswerable without the image. Disable until image is attached.
53a499ba — "Which of the following conditions is characterized by the finding shown?"
No image present. The correct answer is "All of the above" (TB, Multiple myeloma, Osteogenesis imperfecta). The "All of the above" key is itself a quality problem. Without knowing what "the finding" is, the question is meaningless. Disable: broken delivery compounded by weak option format.
5d54abe5 — "A hypertensive patient was admitted with right hemiplegia. A plain CT scan shows what?"
The stem implies an image is being shown ("shows what?") but no image is present. However, this item is partially salvageable: the clinical scenario (hypertension + right hemiplegia) combined with the options (left thalamus/internal capsule hemorrhage vs. frontal lobe hemorrhage vs. ischemic infarct) could function as a text-based clinical reasoning question if the stem is rewritten to remove the image dependency. Fix: rewrite stem as "A plain CT scan of this patient would most likely show:" and remove the image reference.
151fc752 — "The given NCCT shows the presence of:"
No image present. Options are Left/Right intraventricular vs. Left/Right intraparenchymal hemorrhage. Completely unanswerable without the image. Disable until image is attached.
f8434556 — (also cited in Category 2) Broken delivery + wrong subject. Disable.
Additional broken option format issues:
3194e0ee — "Which radiologic feature is suggestive of achalasia cardia?" Correct answer is "All of the above." All three individual options (absent gastric air bubble, air-fluid level in mediastinum, sigmoid esophagus) are genuine features of achalasia. The "All of the above" key means a candidate who knows only one feature can guess correctly. This is a low-discrimination item. Fix: convert to a "NOT a feature of achalasia" format or select the single most specific/classic sign.
Recommended disposition: Disable 0b3207b6, 6641c8e0, 53a499ba, 151fc752, f8434556. Fix feabcddc, 5d54abe5, 3194e0ee.
4. Low-Value But Correct (Too Simple, Low-Yield, Trivia-Heavy, Weak Exam Relevance)
Why this pattern is bad
This is the largest single quality problem in the reviewed set. A question can be factually correct and still be harmful to a question bank if it trains candidates to memorise trivia rather than reason clinically. The Blooms 1 overload (40% of the candidate sample) is directly caused by this category. These items do not discriminate between prepared and unprepared candidates; they do not appear in recent PYQs, and they crowd out higher-quality content.
The specific sub-types observed in this sample are:
- Acronym/definition questions: Testing what an abbreviation stands for
- Equipment specification trivia: Filament dimensions, transducer materials, unit conversions
- Obvious "investigation of choice" questions: Where the answer is so universally known it provides no discriminatory value
- Single-sign recall with no clinical context: "Sign X is seen in condition Y" with no patient scenario
How it shows up
This pattern appears repeatedly across multiple topic areas. It is most concentrated in Radiation Physics, Radiobiology, and Nuclear Medicine topics.
Example question IDs and explanations
c7b4f150 — "What is meant by PET scan?"
Tests whether the candidate knows that PET stands for "Positron Emission Tomography." This is not a clinical reasoning question. No PG entrance examination tests the expansion of a universally known acronym. The distractors ("Positive Emission Tomography," "Positron Energy Tomography") are not plausible confusers for any prepared candidate. Disable.
ac5cf808 — "What is the full form of DICOM?"
Same problem as above. DICOM acronym expansion is not tested in INICET, NEET-PG, or FMGE. Disable.
75920376 — "What are the typical diameter and length of the filament in an X-ray tube?"
Tests memorisation of hardware specifications (2 mm diameter, <1 cm length). This is engineering trivia with zero clinical application. Not tested in any Indian PG entrance exam in the reviewed PYQ set. Disable.
e37070bf — "What material is typically found in an ultrasonography transducer?"
Lead Zirconate Titanate (PZT) is the correct answer. This is a physics fact with no clinical application for a medical PG candidate. The distractors (Sodium Fluoride, Caesium Fluoride, Barium titanate — note: Barium titanate is actually a historically used piezoelectric material, making this a potentially contestable key as well) add no clinical value. Disable.
59a0ab03 — "1 Sievert (Sv) is equal to how many rem?"
Unit conversion question. The answer (100 rem) is a physics constant. While radiation units are relevant to Radiation Physics, this specific conversion is tested at a level appropriate for physics undergraduates, not medical PG candidates who need to understand dose concepts clinically. Disable.
fbf4d484 — "What is the maximum permissible radiation exposure per year recommended by NCRP for a radiation worker?"
The answer (50 mSv) is a factual recall item. While radiation protection limits are clinically relevant, this question provides no clinical context and is a pure memorisation item. It could be salvaged with a clinical scenario (e.g., a pregnant radiation worker, or a scenario requiring dose calculation), but as written it is low-value. Fix: add clinical context, or disable if strong coverage exists elsewhere.
7c57676b — "Which of the following is the best initial imaging study for evaluating a suspected fracture?"
The answer is X-ray. This is so universally known that it provides zero discriminatory value. Any first-year medical student knows this. Tagged as PYQ/TELEGRAM but does not appear in the verified PYQ set. Disable.
d74f5605 — "Intracranial calcification is best diagnosed by CT Scan"
Correct but trivially so. CT is universally acknowledged as superior to plain X-ray for intracranial calcification. No clinical context. Disable.
a213918f — "What is the most common extra-axial intracranial tumor?"
Meningioma is correct. This is a Blooms 1 recall item with no clinical context. It appears in both the generic and risky samples (duplicate — see Category 5). Disable one instance; the concept could be retained in a clinical vignette format.
3333be25 — "The 'sandstorm' appearance on a chest radiograph is characteristic of which condition?"
Pulmonary alveolar microlithiasis is correct. This is a rare condition and a pure sign-to-diagnosis recall item. While the sign is occasionally tested, the question has no clinical context and the distractors (pulmonary edema, alveolar proteinosis, thromboembolism) are not plausible confusers for a candidate who knows the sign. Low discriminatory value. Disable or fix with clinical vignette.
2a1d94ac — "Which type of lung tumour responds best to radiotherapy?"
Small cell carcinoma is correct. This is a Radiation Oncology Basics question that is more appropriately placed in Oncology. As a Radiology question it is borderline wrong-subject. The concept is clinically relevant but the question is a bare recall item. Fix: add clinical context or move to Oncology.
ddb86937 — "What is the investigation of choice for evaluating a renal mass?"
CT scan is correct. Universally known. No clinical context. Disable.
ce0c8f7d — "Minimal ascites can be best detected by which imaging modality?"
USG is correct. Universally known. No clinical context. Disable.
1f3f27fa — "What is the investigation of choice for interstitial lung disease?"
HRCT is correct. Universally known. No clinical context. Disable.
4bcdd051 — "At what gestational age can ultrasound detect cardiac activity?"
5–6 weeks is correct. Pure recall. No clinical context. Low discriminatory value. Disable.
Recommended disposition: Disable the majority of items in this category. The underlying concepts for a few (radiation protection limits, investigation of choice questions) could be retained if rewritten with clinical vignettes — but given the volume of low-value items and the availability of better PYQ coverage, the conservative recommendation is to disable rather than speculatively rewrite.
5. Repetitive or Duplicative Coverage
Why this pattern is bad
Duplicate items in a bank create two problems: they inflate apparent coverage statistics while actually providing redundant practice, and they can cause the same question to appear twice in a single test session, which candidates notice and report. In a bank of 5,382 items, duplication is expected, but when duplicates appear within a 100-item sample, the underlying duplication rate in the full bank is likely substantial.
How it shows up
The most direct duplication observed in this sample is a question appearing in both the generic and risky candidate sets with the same question ID.
Example question IDs and explanations
a213918f — "What is the most common extra-axial intracranial tumor?"
This question ID appears in both the generic sample and the risky sample. It is a literal duplicate — same question ID, same text, same options, same key. This is a data integrity issue. Disable one instance; the surviving instance should be evaluated under Category 4 (low-value) and likely disabled as well.
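A check for this failure mode is mechanical and could run on every sample draw. The sketch below assumes each candidate set is available as a list of question IDs; the variable names are illustrative, not the pipeline's actual data model.

```python
# Sketch of a duplicate-ID check between the generic and risky candidate
# sets. A healthy sampling pipeline should return an empty list here.
def cross_sample_duplicates(generic_ids, risky_ids):
    """Return IDs that appear in both samples, sorted for stable reporting."""
    return sorted(set(generic_ids) & set(risky_ids))

# Illustrative ID lists including the confirmed duplicate from this review.
generic = ["a213918f", "c7b4f150", "81c353a9"]
risky = ["a213918f", "b30dec80"]
print(cross_sample_duplicates(generic, risky))  # ['a213918f']
```

The same intersection logic extends to the full bank (e.g., hashing normalised stem text to catch literal duplicates that were assigned different IDs), though thematic near-duplicates like those below still require human review.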
Thematic near-duplicates (same concept, different wording) observed in the reviewed set:
- Sestamibi scan for parathyroid: 63c22776 ("Sensitive investigation to detect ectopic parathyroid glands") and PYQ 026f0a8f ("Most useful investigation for localization of a parathyroid adenoma") cover the same concept at the same Blooms level. The PYQ version is gold-standard; 63c22776 is redundant. Disable 63c22776.
- Investigation of choice questions: Multiple items ask "investigation of choice for X" (renal mass, ILD, fracture, small intestine tumor, ascites, carotid stenosis) with no clinical context. These are thematically repetitive and individually low-value. The pattern suggests a systematic over-production of this question type.
- Radiosensitivity questions: 7905e47e ("Which cells are most radiosensitive?"), a234bc29 ("Which tissue has maximum radiation dose tolerance?"), and fe615a1f ("Which phase of the cell cycle is most radioresistant?") all test the same conceptual domain (radiation sensitivity hierarchy) at Blooms 1. One well-constructed clinical scenario covering this concept would replace all three.
Recommended disposition: Disable confirmed duplicate a213918f (one instance). Disable 63c22776. Flag the "investigation of choice" and "radiosensitivity" clusters for deduplication review across the full bank.
6. Worthwhile Concept, Weak Execution (Keep the Concept, Fix the Stem/Options/Vignette)
Why this pattern is bad
These items test concepts that are genuinely exam-relevant and appear in PYQs or high-yield topic lists, but the execution is poor enough that the item underperforms its potential. The most common failure modes are: (a) a bare one-liner stem with no clinical context where a vignette would add discriminatory value; (b) distractors that are implausible or that give away the answer; (c) "NOT/EXCEPT" format used on a concept where a positive question would be more natural and harder to game; (d) stems that are ambiguous about what is being asked.
This is the most actionable category for the content operations team because the concept investment is already made — only the execution needs repair.
How it shows up
Example question IDs and explanations
81c353a9 — "A 45-year-old man with AIDS presents with misperceptions, tiredness, and memory loss. Which imaging feature favors PML over HIV encephalopathy?"
This is the best question in the generic sample. It has a clinical vignette, a meaningful differential (PML vs. HIV encephalopathy), and the correct answer (bilateral asymmetrical peripheral subcortical lesions) requires genuine knowledge of PML's imaging pattern. The distractors (mass effect, contrast enhancement, bilateral symmetrical periventricular lesions) are all features of HIV encephalopathy, making them plausible confusers. Keep as-is. This is the template the content team should use for Neuroradiology items.
5cdfdec9 — "Which leukodystrophy characteristically presents with bilateral occipital lobe involvement?"
Adrenoleukodystrophy (ALD) is correct. The concept is high-yield and exam-relevant. However, the question is a bare one-liner. A stronger version would present a young boy with progressive visual loss and behavioral changes, with an MRI showing posterior white matter signal abnormality, and ask for the diagnosis. The current form is Blooms 2 recall; a vignette version would be Blooms 4 clinical reasoning. Fix: add clinical vignette.
c00e105c — "Which of the following is FALSE about Kerley's A lines?"
The correct answer is "Transverse lines at the lung base perpendicular to the pleura" — which is actually the description of Kerley B lines, not A lines. The concept (distinguishing Kerley A from B lines) is high-yield. However, the FALSE format with four technical descriptors is confusing. A better approach would be a clinical scenario (patient with mitral stenosis, chest X-ray showing specific lines) asking for identification. Fix: reframe with clinical context; consider converting to a positive identification question.
9ca5db72 — "Cavitating pulmonary lesions can be seen in the following EXCEPT: Sarcoidosis"
The concept is correct — sarcoidosis rarely cavitates (though it can in advanced disease, making this slightly contestable). The question is a useful concept but the bare list format without clinical context is weak. A stronger version would present a patient with specific clinical features and ask which condition is least likely to explain the cavitating lesion. Fix: add clinical context; verify that sarcoidosis is the unambiguous exception.
1ac6ca66 — "X-ray features of hypoparathyroidism are the following EXCEPT: Subperiosteal resorption"
Subperiosteal resorption is a feature of hyperparathyroidism, not hypoparathyroidism — so the key is correct. This is a high-yield concept (distinguishing hypo- from hyperparathyroidism on X-ray). The execution is a bare list. A clinical vignette (post-thyroidectomy patient with tetany, X-ray findings) would elevate this to Blooms 4. Fix: add clinical vignette.
297cbe74 — "Whole-body iodine scan after total thyroidectomy is not recommended for which type of thyroid cancer?"
Medullary thyroid cancer is correct (it does not take up iodine as it arises from parafollicular C-cells). This is a high-yield concept. The question is a clean one-liner that works reasonably well, but a clinical scenario (post-thyroidectomy patient with elevated calcitonin, asked about appropriate follow-up imaging) would be stronger. Keep with minor fix: add brief clinical context.
5324c498 — "What condition is characterized by a 'signet ring appearance' on CT?"
Bronchiectasis is correct. The concept is high-yield. But this is a pure sign-to-diagnosis recall item at Blooms 1. A better version would present a patient with recurrent chest infections and productive cough, with a CT chest showing dilated bronchi with thickened walls, and ask for the diagnosis or the name of the CT sign. Fix: add clinical vignette to elevate to Blooms 3–4.
f0be93b2 — "Maclean's sign is seen in which of the following conditions?"
Gouty arthritis is the marked answer. Maclean's sign (overhanging edge sign) is indeed associated with gout. However, this is an obscure eponymous sign that is rarely tested in recent PYQs. The concept of gout on X-ray is high-yield, but the specific eponym is low-yield trivia. Fix: reframe as "Which of the following is a characteristic X-ray finding in gouty arthritis?" with options including overhanging edge/rat-bite erosions, subchondral cysts, joint space narrowing, and periarticular calcification — this tests the same concept at higher clinical relevance.
687a1f48 — "In an elderly patient with sudden onset of severe lower back pain without trauma, which imaging study is most appropriate to diagnose a suspected vertebral compression fracture?"
X-ray lumbar spine is the marked answer. The clinical vignette is a good start, but the answer is debatable: MRI is actually more sensitive for acute vertebral compression fractures (especially to distinguish osteoporotic from malignant fractures) and is increasingly the preferred modality. X-ray is the appropriate first-line/initial investigation, but the question asks "most appropriate to diagnose" which could reasonably be MRI. Fix: change "most appropriate to diagnose" to "most appropriate initial imaging study" to make X-ray unambiguously correct.
621bc942 — "Stryker's view is used to visualize which abnormality in the shoulder joint?"
Recurrent subluxation (Hill-Sachs lesion) is correct. This is a specific radiological view question that is occasionally tested. The concept is valid but the question is a bare one-liner. A clinical scenario (young athlete with recurrent shoulder dislocation, asked which special view best demonstrates the bony defect) would be more appropriate. Fix: add clinical context.
2a68c2fa — "A 50-year-old female with difficulty swallowing and history of multiple CT scans of head and neck. Which cancer is she predominantly susceptible to?"
Papillary thyroid carcinoma is correct (radiation-induced thyroid cancer is predominantly papillary type). This is a good Blooms 3 application question. The vignette is reasonable. The main weakness is that the dysphagia detail is a red herring that could mislead candidates toward esophageal pathology. Fix: remove the dysphagia detail or make it relevant to the radiation exposure history (e.g., "she now presents with a neck mass").
Recommended disposition: Keep 81c353a9 as-is. Fix 5cdfdec9, c00e105c, 9ca5db72, 1ac6ca66, 297cbe74, 5324c498, 687a1f48, 621bc942, 2a68c2fa. Fix or disable f0be93b2 depending on whether the eponym has recent PYQ precedent.
Prioritization
The following table summarises the recommended actions by urgency and volume.
Tier 1 — Immediate action required (disable or expert review before any live use)
| Priority | Action | Rationale | Approximate item count in sample |
|---|---|---|---|
| 1A | Disable wrong-key items | Factually unsafe; harms candidates who reason correctly | 2–3 confirmed (b30dec80; review c4c0eabe) |
| 1B | Disable broken-delivery image-dependent items | Non-functional without image | 5–6 items |
| 1C | Disable wrong-subject (dental/darkroom) items | Irrelevant to MBBS PG stream | 4–5 items |
| 1D | Expert review of contestable keys | Ambiguous correct answer; risk of mass reports | 3–4 items |
Tier 2 — Fix before next content cycle
| Priority | Action | Rationale | Approximate item count in sample |
|---|---|---|---|
| 2A | Add clinical vignette to bare one-liner items with high-yield concepts | Elevates Blooms level; improves discriminatory value | 10–15 items |
| 2B | Fix stems with image references but no image (partial salvage) | Reframe as text-only questions | 2–3 items |
| 2C | Fix ambiguous "investigation of choice" stems with clinical context | Reduces contestability | 3–4 items |
Tier 3 — Disable in bulk (low-value trivia)
| Priority | Action | Rationale | Approximate item count in sample |
|---|---|---|---|
| 3A | Disable acronym/definition questions (PET, DICOM, CTDI) | Zero exam relevance; Blooms 1 trivia | 3 items |
| 3B | Disable equipment specification questions (filament dimensions, transducer material) | Engineering trivia; not tested in PG exams | 2–3 items |
| 3C | Disable universally known "investigation of choice" questions with no clinical context | No discriminatory value | 5–6 items |
| 3D | Disable duplicate items | Data integrity | 1 confirmed duplicate |
Structural recommendation: The Blooms distribution must be actively corrected. The target for a Radiology bank serving INICET/NEET-PG candidates should be approximately 15% Blooms 1, 30% Blooms 2, 30% Blooms 3, and 25% Blooms 4. The current sample (40% Blooms 1, 50% Blooms 2, 10% Blooms 3–4) is inverted relative to this target. New content production should be paused on Blooms 1 items and redirected toward clinical vignette and image-based items at Blooms 3–4.
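The scale of the rebalancing implied above can be sketched quickly. The following is an illustrative calculation only: the per-level 3/4 split of the current sample (assumed 5%/5% from the combined 10%) and the extrapolation from the 100-item sample to the full 5,382-item pool are assumptions, not measured values.

```python
# Illustrative sketch: distance between the sampled Blooms distribution and the
# proposed target mix, extrapolated to the full pool. Percentages are from this
# review; the 3/4 split and pool-level counts are assumptions for illustration.
POOL_SIZE = 5382  # total Radiology items in the pool

current = {1: 0.40, 2: 0.50, 3: 0.05, 4: 0.05}  # sample; levels 3-4 split assumed evenly
target  = {1: 0.15, 2: 0.30, 3: 0.30, 4: 0.25}  # proposed target mix

for level in sorted(target):
    delta = target[level] - current[level]
    items = round(delta * POOL_SIZE)
    action = "add" if items > 0 else "retire/rework"
    print(f"Blooms {level}: {current[level]:.0%} -> {target[level]:.0%} "
          f"({action} ~{abs(items)} items if the sample is representative)")
```

Even with these rough assumptions, the direction is clear: on the order of a thousand-plus Blooms 1 items would need retirement or uplift, with a comparable volume of new Blooms 3–4 production.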
Example Keep / Fix / Disable Calls
The following provides concrete, actionable disposition calls for representative items from the reviewed set.
KEEP (as-is or with minor fixes)
81c353a9 — AIDS patient, PML vs. HIV encephalopathy imaging differentiation. Clinical vignette, plausible distractors, Blooms 2 (could be argued Blooms 3), high exam relevance. Keep as-is.
5cdfdec9 — Adrenoleukodystrophy with bilateral occipital involvement. Correct, high-yield concept. Keep concept; fix execution (add vignette) — but the bare version is acceptable as a holding item.
297cbe74 — Iodine scan not recommended for medullary thyroid cancer. Correct, high-yield, clean distractors. Keep with minor stem improvement.
1ac6ca66 — Hypoparathyroidism X-ray features EXCEPT subperiosteal resorption. Correct, high-yield concept distinguishing hypo- from hyperparathyroidism. Keep concept; fix with clinical vignette for higher Blooms.
a3d2971b — Mammogram features of malignant tumor EXCEPT macrocalcification. Correct, high-yield, good distractor set. Keep.
FIX
feabcddc — CKD patient with elevated PTH, rugger-jersey spine. Strong clinical vignette but broken by image reference. Fix: remove "visible in the image" and reframe as "Which of the following spinal X-ray signs is characteristic of renal osteodystrophy in this patient?"
5d54abe5 — Hypertensive patient with right hemiplegia, CT findings. Fix: remove image dependency from stem; rewrite as "A plain CT scan of this patient would most likely show:" — the clinical reasoning is preserved without requiring an image.
687a1f48 — Elderly patient, vertebral compression fracture, imaging choice. Fix: change "most appropriate to diagnose" to "most appropriate initial imaging study" to make X-ray unambiguously correct.
2a68c2fa — Radiation exposure and thyroid cancer type. Fix: remove the dysphagia red herring; replace with "she now presents with a palpable neck mass" to make the clinical scenario coherent.
3194e0ee — Achalasia cardia radiological features. Fix: convert from "All of the above" key to a "NOT a feature of achalasia" format, or select the single most specific sign as the key.
c00e105c — Kerley's A lines, FALSE statement. Fix: reframe with a clinical scenario (mitral stenosis patient, chest X-ray) and ask for identification of the specific line type, rather than testing a FALSE statement about technical descriptors.
9ca5db72 — Cavitating pulmonary lesions EXCEPT sarcoidosis. Fix: add clinical context (patient with specific features) and verify that sarcoidosis is the unambiguous exception given that cavitation can rarely occur in advanced sarcoidosis.
89302f96 — Viable myocardium identification after MI. Fix: add "gold standard" or "most accurate" qualifier to the stem to disambiguate PET from Thallium-201.
DISABLE
b30dec80 — Hegar's sign as ultrasound sign of fetal death. Wrong key. Hegar's sign is a clinical obstetric sign of early pregnancy (softening of the uterine isthmus on bimanual examination), not an ultrasound finding; fetal death is confirmed on ultrasound by absent cardiac activity. Disable immediately.
c7b4f150 — "What is meant by PET scan?" Acronym definition. Zero exam relevance. Disable.
ac5cf808 — "What is the full form of DICOM?" Acronym definition. Zero exam relevance. Disable.
75920376 — X-ray tube filament dimensions. Engineering trivia. Disable.
e37070bf — Ultrasonography transducer material. Physics trivia with potentially contestable key (barium titanate is a historical piezoelectric material). Disable.
ce94779d — Bitewing radiograph angulation. Dental radiology. Wrong subject for this bank. Disable.
8c88d97e — Radiographic detection of caries types. Dental radiology. Wrong subject. Disable.
fc6dd7db — Radiopacity of dental materials. Dental radiology. Wrong subject. Disable.
6fb58e7e — Fixer solution chemistry (ammonium sulfite turning milky). Obsolete darkroom chemistry. Wrong topic, zero exam relevance. Disable.
0b3207b6 — "The given CT scan shows which brain lesion?" No image present. Disable until image attached and verified.
6641c8e0 — "Which is true about the CT angiography report shown?" No image present. Disable until image attached.
53a499ba — "Which condition is characterized by the finding shown?" No image + "All of the above" key. Disable.
7c57676b — X-ray as best initial study for fracture. Universally known. Zero discriminatory value. Disable.
ddb86937 — CT scan for renal mass evaluation. Universally known. Disable.
ce0c8f7d — USG for minimal ascites detection. Universally known. Disable.
1f3f27fa — HRCT for interstitial lung disease. Universally known. Disable.
a213918f (duplicate instance) — Most common extra-axial intracranial tumor. Confirmed duplicate across generic and risky samples. Disable one instance; evaluate the surviving instance under Category 4.
63c22776 — Sestamibi scan for ectopic parathyroid. Redundant with PYQ 026f0a8f which covers the same concept at the same Blooms level. Disable.
59a0ab03 — 1 Sievert = 100 rem. Unit conversion trivia. Disable.
fbf4d484 — NCRP maximum permissible dose 50 mSv. Bare recall, no clinical context. Disable (or fix with clinical scenario if radiation protection coverage is thin in the bank).