The teaching preparation problem
Every medical educator has experienced this: you are preparing a lecture for tomorrow morning. The topic is muscle-invasive bladder cancer. You need the treatment algorithm from the EAU guideline. You need the survival data from the latest randomised trial. You need a clear figure showing the staging system. You need the recommendation on neoadjuvant chemotherapy with the evidence level.
You know all of this information exists. You have read it. You may even have highlighted it. But right now, at 9 PM the night before the lecture, you cannot remember which guideline edition has the algorithm you liked, which paper has the survival curve you want to show, or which textbook chapter explains the staging system most clearly.
So you start opening PDFs. The EAU guideline — 200 pages, you scroll through looking for the algorithm. The Campbell-Walsh chapter — 80 pages, you search for "TNM" and get 50 hits. The SWOG trial — you cannot even find the PDF. Was it in your Downloads folder or on the other laptop? Forty-five minutes later, you have found two of the four things you needed and your lecture preparation has barely started.
The bottleneck in medical teaching is not knowing what to teach. It is finding the specific figure, the specific data point, the specific recommendation across a sprawl of documents that you know you have read but cannot efficiently locate.
Why slide decks drift from primary sources
Here is an uncomfortable truth about medical education: most lecture slides are built from other lecture slides. A senior colleague shares their deck. You modify it. A trainee modifies your version. Three generations later, nobody knows where the figure on slide 14 originally came from. The survival data on slide 22 might be from a 2018 study, but the slide says "adapted from" without a proper citation. The treatment algorithm looks familiar but you cannot trace it to a specific guideline edition.
This happens not because educators are careless, but because going back to the primary source is too time-consuming to do for every slide. When you are preparing a 40-slide lecture covering multiple topics, the pragmatic choice is to reuse what already exists and update what you know has changed. The problem is that you do not always know what has changed. The staging system was revised in 2022. The chemotherapy recommendation shifted from conditional to strong in the latest update. The survival figures you are showing are from a trial that has since been superseded by longer follow-up data.
Slide decks drift from primary sources because the cost of going back to the source is too high. If you could find the original figure, the original data point, the original recommendation in 30 seconds instead of 30 minutes, you would do it every time.
How Medevidex changes the preparation workflow
The approach is simple in concept: upload your teaching source materials into a Medevidex collection, then query them when you need specific content for your lectures. The system searches across all your uploaded documents and returns cited passages with exact page references.
In practice, this transforms how you prepare teaching materials. Instead of opening ten PDFs and scrolling through each one, you ask a question. "Find the treatment algorithm for muscle-invasive bladder cancer." Medevidex retrieves the relevant figure from your EAU guideline with the page number. You click through to the source page, screenshot the figure for your slide, and add a proper citation. Total time: under a minute. The old way: fifteen to twenty minutes, if you found it at all.
"What is the 5-year cancer-specific survival for pT3 renal cell carcinoma?" The system retrieves the passage from the specific paper in your collection that reports this data, citing the document and page. You have the number, the source, and the context — exactly what you need for a teaching slide that your trainees can trace back to the evidence.
"What is the current EAU recommendation on neoadjuvant chemotherapy for cT2 bladder cancer, and what is the evidence level?" The system finds the recommendation from your uploaded guideline, with the strength rating and the supporting references mentioned in that section. Your slide now shows not just the recommendation but the strength of evidence behind it — which is what you want your trainees to learn.
Finding figures, tables, and clinical images
Medical teaching relies heavily on visual content. Treatment algorithms, staging diagrams, anatomical illustrations, survival curves, forest plots — these are often the centrepiece of a lecture slide, not supporting material. And they are exactly what is hardest to find when you are scrolling through a 200-page guideline or a 50-page textbook chapter.
Medevidex indexes figures and tables alongside text during document processing. This means when you query for "radical cystectomy treatment algorithm," the system can retrieve not just a paragraph that mentions the algorithm but the actual figure that contains it. The citation tells you which document and which page, so you can go directly to the source and extract the visual for your lecture.
This is particularly valuable for clinical images. If you are teaching a session on cystoscopic findings, the images in your textbook chapters are the teaching material. Being able to search for "papillary bladder tumour cystoscopy appearance" and retrieve the relevant figure from your uploaded textbook — with the proper caption and page reference — means your teaching slides are grounded in published, citable content rather than random images pulled from an internet search.
In medical education, figures are not illustrations — they are evidence. A treatment algorithm from a guideline carries the weight of a systematic review and expert consensus. That provenance matters, and proper citation preserves it.
Building a reusable teaching library
The real power emerges when you build a structured teaching library over time. Instead of uploading documents ad hoc before each lecture, you create collections that mirror your teaching responsibilities.
A urology educator might organise collections by rotation: one for undergraduate teaching covering core urology topics, one for postgraduate exam preparation with the key guidelines and landmark papers, one for journal club with recent high-impact publications, and one for subspecialty topics like uro-oncology or endourology with the relevant chapters and guidelines for each.
Once these collections exist, they become a permanent, queryable resource. When you are assigned a lecture on upper tract urothelial carcinoma next month, you do not start from scratch — you query your uro-oncology collection. When a trainee asks you for the key paper on active surveillance for small renal masses, you do not search your email — you query your postgraduate collection and send them the citation.
Collections can also be updated incrementally. When a new guideline edition is published, upload it to the relevant collection. The new content is indexed alongside the existing material. When you query next time, the answers reflect the updated evidence. Your teaching library stays current without requiring a complete rebuild.
A well-maintained teaching library in Medevidex is not just a document store — it is an expert assistant that knows your curriculum, your source materials, and exactly where to find the figure, data point, or recommendation you need.
Generating exam-style questions from source material
Another practical application for educators is generating assessment content. Writing good exam questions is time-consuming because each question needs to be grounded in specific, citable evidence — not in general knowledge or clinical impression.
With your teaching materials uploaded to Medevidex, you can query for the factual content that forms the basis of exam questions. "What are the indications for radical nephrectomy versus partial nephrectomy according to the EAU guideline?" The retrieved passage gives you the evidence-based answer, which you can then convert into a multiple-choice question with the correct answer directly traceable to the guideline.
This approach has a subtle but important benefit: it forces your exam questions to be evidence-based. When every question stem and every correct answer can be traced to a specific passage in a specific document, your assessment is testing whether trainees know the evidence — not whether they can guess what the examiner was thinking. This is how postgraduate exams should work, and Medevidex makes it practical to build them this way.
The attribution problem in medical education
Medical educators have an attribution problem, and most of us do not talk about it. We use figures from guidelines without proper citation. We quote statistics without source references. We present treatment algorithms without indicating which guideline body produced them or which edition they come from. We do this not out of dishonesty but out of practical necessity — finding and formatting the proper citation for every element on every slide is prohibitively time-consuming.
This matters more than it might seem. When a trainee sees a treatment algorithm on a slide, they should be able to trace it back to the source. Not just to confirm the content is accurate, but to learn the habit of evidence-based thinking — the discipline of knowing where your knowledge comes from and how strong the evidence behind it is.
When educators skip attribution, trainees learn to treat clinical knowledge as tribal lore rather than evidence-based practice. They learn to say "I was told" instead of "the EAU guideline recommends, with a strong evidence rating." This is a cultural problem in medical education, and while a software tool cannot solve it entirely, making proper attribution fast and easy removes the most common excuse for skipping it.
If citing your sources takes 30 seconds instead of 30 minutes, you will cite your sources. Medevidex makes attribution the path of least resistance.
Teaching trainees to use evidence directly
There is a broader pedagogical argument here. Medical education should teach trainees to go to the primary source — to read the guideline, not the summary; to read the trial, not the editorial; to check the recommendation grade, not just the recommendation. But we often teach the opposite by example: we present distilled summaries, cite from memory, and encourage trainees to "just know" the answer.
Medevidex can be used as a teaching tool in its own right. During a tutorial, instead of telling trainees the answer, ask them to query the collection. "What does the EAU guideline say about the role of lymph node dissection in radical cystectomy?" They ask the system, read the retrieved passage, and discuss it — learning not just the content but the practice of consulting the evidence.
This is closer to how they will work in clinical practice. When a consultant asks a registrar to look up the guideline recommendation before a multidisciplinary team meeting, the expectation is that they will find the relevant passage and present it with context. Medevidex is the tool that makes this process fast enough to do in real time — during the tutorial, during the ward round, during the MDT prep.
What Medevidex does not replace
Medevidex does not replace the educator. It does not design your curriculum. It does not sequence your lectures. It does not decide which topics to emphasise or which evidence to prioritise. It does not adapt to the level of your audience or gauge whether your trainees understood the last concept before moving to the next one. These are irreducibly human skills — the craft of teaching — and no retrieval system can substitute for them.
What it does replace is the mechanical drudgery of finding things in documents. The searching, the scrolling, the "I know I read this somewhere" cycle that consumes hours of preparation time every week. The intellectual work of choosing what to teach, how to frame it, and how to assess understanding remains entirely with you. The mechanical work of finding the supporting evidence is handled by the system.
The result is not a shortcut — it is a reallocation. Time previously spent searching for content can now be spent designing better learning experiences, crafting more nuanced assessment questions, and engaging more thoughtfully with the material you are teaching. That is a reallocation worth making.
Read more
AI for Continuing Medical Education · AI for Medical Exam Preparation · Chat With Your Medical PDFs