
Links, Images & Your First Complete Specification

Completing Your Markdown Foundation

You've learned the core building blocks: headings for structure, lists for organization, and code blocks for examples. Now you'll learn the final three pieces that make specifications truly complete: emphasis to highlight what matters, links to connect to external resources, and images to show what things look like.

This lesson brings together everything you've learned in Lessons 1-4 plus these final elements. By the end, you'll write your first complete specification – one that an AI agent could actually implement from.


Concept 1: Emphasis - Making Text Stand Out

Markdown gives you two ways to emphasize text: bold and italic.

When to Use Bold

Use bold for terms that are critical to understanding your specification:

  • Feature names: "**Add tasks** to the list" (shows the exact feature name)
  • Important constraints: "The app **MUST run offline**" (shows non-negotiable requirements)
  • Key terms: "Use **UUID** for unique identifiers" (defines technology choices)
  • Action words: "**Validate** user input before saving" (shows what must happen)

When to Use Italic

Use italic for emphasis or definitions:

  • Emphasis on importance: "This feature is *especially critical* for users with slow connections"
  • Introduced terms: "A *specification* is a document describing what to build"
  • Conditional emphasis: "*Only* users with admin roles can delete tasks"

The Syntax (Direct Teaching)

Markdown uses very simple patterns for emphasis:

Bold - surround text with **double asterisks**:

This is **important**.

Italic - surround text with *single asterisks*:

This is *emphasized*.

That's it! No complex syntax to memorize. The pattern is:

  • Two asterisks on each side = bold
  • One asterisk on each side = italic

Example 1: README with Proper Emphasis

Here's a real specification that emphasizes key information:

# Todo List Application

## Problem
Users need a **simple, fast** way to manage their daily tasks. Existing todo apps are too complex.

## Solution
Create a **command-line todo app** that is easy to use and **runs locally** (no internet required).

## Features
- **Add** tasks with a short description
- **List** all tasks with their status
- **Mark** tasks as complete
- **Delete** completed tasks
- **Save** tasks to a file so they persist between sessions

## Expected Output
When a user runs `list`, the app should show:

Your tasks:

  1. Buy groceries [pending]
  2. Finish homework [pending]
  3. Call mom [complete]

## Why This Matters
By using **bold** on feature names and key constraints, anyone reading this specification – human or AI – immediately understands what's *most important*. This clarity **reduces misunderstandings** and helps AI agents generate more accurate code.

Notice how the emphasis makes it easy to scan and understand what matters. Without bold, every word has equal weight and it's harder to find the key ideas.


Concept 2: Links - Connecting to External Resources

A specification doesn't exist in isolation. You often need to reference external documentation, examples, or standards. Links solve this problem.

When you write "Use Python's requests library" as plain text, the reader might not know:

  • What is the requests library?
  • Where do I find it?
  • How do I use it?

But if you write "Use Python's [requests library](https://requests.readthedocs.io/)," the reader can click through to complete documentation instantly.

The Syntax (Direct Teaching)

Markdown links are also very simple:

[link text](url)
  • link text = what the reader sees and clicks
  • url = where the link goes

Example:

Read the [Python documentation](https://docs.python.org/) for help.

That renders as: Read the Python documentation for help.

Beginner mistake:

[Wrong link](https://docs.python.org/3/ reference guide)

This won't work because the space breaks the URL. Either:

  1. Use a URL without spaces (recommended):
[Python reference](https://docs.python.org/3/reference/)
  2. Or use URL encoding (replace space with %20):
[reference guide](https://docs.python.org/3/reference%20guide)

For specifications, always stick with option 1 – find clean URLs without spaces.


Example 2: Specification with Reference Links

Here's how to add helpful links to your specification:

# Weather App Specification

## Required Libraries
- [Python requests library](https://requests.readthedocs.io/) - for making API calls
- [OpenWeatherMap API](https://openweathermap.org/api) - free weather data source

## Data Format
Data should be formatted as JSON. See the [JSON specification](https://www.json.org/) for details.

## Testing
When you test your app, verify it works like the examples in the [OpenWeatherMap docs](https://openweathermap.org/current).

Now readers can click through and see:

  • How to use the requests library
  • Where to get weather data
  • What JSON looks like
  • What sample outputs should look like

This dramatically improves clarity because you're not just describing, you're showing where to find more information.
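
If you're wondering what those linked docs actually cover, here is a minimal sketch of the kind of call the requests library makes. The endpoint and parameters follow the OpenWeatherMap pages linked above; the function name and API key are placeholders, and the exact response fields may differ from this sketch:

```python
# Minimal sketch of "making API calls" with the requests library linked above.
# The endpoint and parameters follow the OpenWeatherMap current-weather docs;
# the API key is a placeholder you would obtain from openweathermap.org.
import requests

def current_weather(city, api_key):
    response = requests.get(
        "https://api.openweathermap.org/data/2.5/weather",
        params={"q": city, "appid": api_key, "units": "metric"},
        timeout=10,
    )
    response.raise_for_status()            # fail loudly on a bad key or unknown city
    data = response.json()                 # JSON, matching the Data Format section
    return data["main"]["temp"], data["weather"][0]["description"]

# Example use (requires a real API key):
# temperature_c, conditions = current_weather("London", "YOUR_API_KEY")
```

You don't need to understand every line yet; the point is that the links let a reader (or an AI agent) check these details instead of guessing.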


Common Link Mistakes

Here's what NOT to do:

[Check this out](https://docs.python.org/3/ with spaces)

The space in the middle of the URL breaks the link. The reader will see the text highlighted, but clicking won't work properly.

Fix it by removing spaces:

[Check the Python docs](https://docs.python.org/3/reference/)

Concept 3: Images - Showing What Things Look Like

Sometimes words aren't enough. You need to show what something looks like. That's where images come in.

Why Images Matter in Documentation

Images help readers understand:

  • What the UI looks like - Screenshots show expected interface
  • How data flows - Diagrams explain system architecture
  • Project branding - Logos make READMEs professional

Markdown images use almost the same syntax as links, with one difference - an exclamation mark ! at the start:

![alt text](image-url)
  • alt text = description of the image (shown if image doesn't load, read by screen readers)
  • image-url = where the image is located (web URL or local file path)

Example:

![Python logo](https://www.python.org/static/community_logos/python-logo.png)

Where Images Come From

Option 1: Online images (easiest for beginners)

Use a direct image URL from the web:

![Example screenshot](https://example.com/screenshot.png)

Option 2: Local images in your project

Put images in a folder (like images/ or assets/) and reference them:

![App screenshot](./images/screenshot.png)
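
That relative path only works if your folder layout matches it. Here is one hypothetical layout that would make the example above display correctly:

```text
your-project/
├── README.md              <- contains ![App screenshot](./images/screenshot.png)
└── images/
    └── screenshot.png     <- the file the relative path points to
```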

For beginners: Start with online image URLs (from GitHub, documentation sites). Later you can add local images to your projects.

AI Native Approach: Let AI Help with Images

If you need diagrams or don't have good screenshots yet:

Ask your AI companion:

I need a simple architecture diagram showing: User → API → Database.
Can you suggest a tool to create this, or describe how I should visualize it?

AI can suggest tools (like Excalidraw, draw.io) or help you find placeholder images while building your spec.


Example: README with Image

Here's how images make READMEs more professional:

# Weather Dashboard

![Weather Dashboard Screenshot](https://via.placeholder.com/800x400.png?text=Weather+Dashboard)

## Problem
Users want quick access to local weather information.

## Features
- **Display** current temperature and conditions
- **Show** 7-day forecast
- **Save** favorite locations

## Screenshot
Here's what the app looks like in action:

![App interface](https://via.placeholder.com/600x300.png?text=App+Interface)

See how images make it immediately clear what the app looks like? That's powerful.

Common Image Mistakes

Mistake 1: Forgetting the ! at the start

[Missing exclamation](image.png)

This creates a link to the image, not an embedded image. Always use ![...] for images.

Mistake 2: Broken image paths

![Screenshot](screenshot.png)

If screenshot.png doesn't exist in the same folder, the image won't show. Check your paths!

Mistake 3: Too many large images

Don't embed 20 screenshots in one README. Use images strategically:

  • 1 logo/banner at the top
  • 1-2 key screenshots showing the app
  • Diagrams only when words aren't enough

Concept 4: Bringing It All Together - Your First Complete Specification

You now know:

  • Headings (Lesson 2) - to create document structure
  • Lists (Lesson 3) - to organize features and steps
  • Code blocks (Lesson 4) - to show expected output
  • Emphasis (Lesson 5) - to highlight what matters
  • Links (Lesson 5) - to connect to resources
  • Images (Lesson 5) - to show what things look like

When you combine all six elements, you create a specification document that both humans and AI can understand clearly.

Here's the complete picture:

# Smart Reminder App

## Problem
Users forget important deadlines because they have no easy way to track them.

## Solution
Build a **command-line reminder app** that:
- **Stores** reminders in a file
- **Checks** for upcoming reminders when the app starts
- **Notifies** users of due reminders

## Features
- **Create** a new reminder with date and time
- **List** all reminders, showing which are due soon
- **Mark** reminders as done
- **Delete** completed reminders

## Required Libraries
- [Python datetime module](https://docs.python.org/3/library/datetime.html) - for handling dates
- [Python JSON module](https://docs.python.org/3/library/json.html) - for storing reminders

## Expected Output
When a user lists reminders, they should see:

```text
Upcoming reminders:
1. Buy birthday gift - due TODAY
2. Call dentist - due in 3 days
3. Submit report - due in 5 days
```

## Success Criteria
- ✓ App runs from the command line
- ✓ Reminders persist between sessions (saved to file)
- ✓ All dates are validated (no invalid dates like "Feb 30")
- ✓ Code is tested (ask an AI to review for bugs)

See what's happening here?

- **Headings** (`#`) create a clear structure
- **Bold** highlights feature names and important concepts
- **Lists** show features and criteria clearly
- **Code blocks** with language tags show expected output
- **Links** point to documentation

This is what an AI agent reads when you hand them your specification. Because it's **well-structured** and **clear**, the AI can generate more accurate code.
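
To make that concrete, here is a minimal sketch of the kind of code an AI agent might produce from the Expected Output section above. The file name reminders.json and the exact data shape are assumptions added for illustration; the spec itself does not pin them down:

```python
# A rough sketch of the "list reminders" behavior from the spec above.
# Assumption (not stated in the spec): reminders.json holds a list of
# objects shaped like {"title": "Call dentist", "due": "2025-07-01"}.
import json
from datetime import date

def load_reminders(path="reminders.json"):
    # The json module stores the reminders, as the Required Libraries section suggests
    with open(path) as f:
        return json.load(f)

def describe_due(due_string):
    # The datetime module handles dates; fromisoformat raises ValueError for
    # impossible dates such as "2025-02-30", supporting the validation criterion
    days = (date.fromisoformat(due_string) - date.today()).days
    return "due TODAY" if days == 0 else f"due in {days} days"

def list_reminders(reminders):
    print("Upcoming reminders:")
    for number, reminder in enumerate(reminders, start=1):
        print(f"{number}. {reminder['title']} - {describe_due(reminder['due'])}")

if __name__ == "__main__":
    list_reminders(load_reminders())
```

Notice how the Expected Output block is what tells the AI exactly what the printed lines should look like; the headings, lists, and links tell it everything else.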

---

## Major Exercise: Task Tracker App (Part 4 - Complete It!)

**Final Step**: Open your Task Tracker App specification from Lessons 2-4. You'll now **complete it** by adding links, images, and emphasis — creating a publication-ready specification.

### Your Task for Lesson 5

Add the final polish to make your specification complete and professional:

**Part 1: Add Emphasis to Key Terms**

Go through your specification and add **bold** to important terms:

```markdown
## Problem
Users forget important tasks because they lack a **simple, offline** task management system.

## Features

### Add Tasks
- Create tasks with **title** and **description**
- Set **optional due dates**
- Assign **priority levels** (high, medium, low)

### View Tasks
- Display all tasks with **status indicator**
- Filter by priority or due date
- Show completed and pending **separately**
```

**Part 2: Add Reference Links**

Add a new section with helpful links:

## Reference Links
- [Python datetime documentation](https://docs.python.org/3/library/datetime.html) - for handling due dates
- [Python file I/O guide](https://docs.python.org/3/tutorial/inputoutput.html#reading-and-writing-files) - for saving tasks to file
- [Task management best practices](https://en.wikipedia.org/wiki/Task_management) - understanding task tracking concepts

**Part 3: Add an Image (Optional but Recommended)**

At the top of your spec, add a screenshot or diagram:

# Task Tracker App

![Task Tracker Screenshot](https://via.placeholder.com/800x200.png?text=Task+Tracker+CLI)

## Problem
...

Use a placeholder for now (https://via.placeholder.com/800x200.png?text=Task+Tracker+CLI). In real projects, you'd replace this with an actual screenshot.

Acceptance Criteria

Your complete Task Tracker App specification is ready when:

  • Heading hierarchy is correct (from Lesson 2): One # for title, ## for sections, ### for feature details
  • Lists are appropriate (from Lesson 3): Unordered for features, ordered for installation steps
  • Code blocks show expected output (from Lesson 4): At least one fenced block with text tag showing what the program prints
  • Emphasis highlights key terms (Lesson 5): Bold on important feature names and constraints
  • At least 3 working reference links (Lesson 5): Real URLs to Python docs or relevant resources
  • Image is included (Lesson 5 - optional): Placeholder or actual screenshot at the top
  • Overall clarity: Someone reading your spec could implement this app

Congratulations!

You've just completed a full specification built across 4 lessons:

  • Lesson 2: Created heading structure
  • Lesson 3: Added lists to organize features and steps
  • Lesson 4: Added code blocks to show expected behavior
  • Lesson 5: Added emphasis, links, and images for clarity

This is exactly how real specifications are built — iteratively, adding detail at each stage. You now have a publication-ready spec that an AI agent can implement.


Understanding Why This Matters

At this point, you've learned all the foundational markdown skills an AI-native developer needs. But more importantly, you understand why these skills matter:

  • Specifications are how you communicate intent to AI agents
  • Clarity in your specification directly affects code quality
  • Structure (headings, lists, code blocks) lets AI agents parse meaning
  • Emphasis and links guide attention to what matters most

You're not just learning markdown syntax – you're learning the Intent Layer of AIDD (AI-Driven Development). This is the core skill that makes everything else possible.

When you can write a clear specification, you can:

  • Generate code that matches your intent
  • Validate whether AI output is correct
  • Refine your spec when AI misunderstands
  • Collaborate with others (and AI) effectively

That's professional-level development – even if you haven't written a single line of code yet.


Try With AI

Now test your complete Task Tracker App specification with AI implementation and validation.

Setup

Use ChatGPT web (if you haven't set up another AI tool yet). If you already have Claude Code, Gemini CLI, or another AI companion installed from earlier chapters, feel free to use that instead.

Prompt Set

Prompt 1 (Initial Validation): Copy your complete Task Tracker App specification and paste it into ChatGPT. Then ask:

Read this Task Tracker App specification carefully. Is this specification
clear enough for you to implement? What parts are clear? What parts are
missing or confusing?

Expected Output: ChatGPT will tell you which parts are clear and which need more detail.

Prompt 2 (Clarity Refinement - If Needed): If ChatGPT found confusing parts, revise them and ask:

Based on your feedback, I've revised the unclear parts.
Here's my improved specification: [paste updated spec]

Is this clearer now? Can you identify any remaining issues?

Prompt 3 (Full Implementation Test): Once your spec is validated, ask:

Implement this Task Tracker App specification in Python.
Make the code match the expected output I showed in the spec.
Include all features I listed.

Your Task: Execute and Validate (CRITICAL STEP)

After ChatGPT generates code, apply the full verification framework:

Step 1: Read the Code

  • Ask ChatGPT: "Explain the structure of this code. How does it implement each feature I specified?"

Step 2: Save and Run

  • Save to task_tracker.py
  • Run: python task_tracker.py
  • Actually execute it — this is not optional!

Step 3: Compare to Your Specification

Go through your spec section by section:

  • Does the menu match your Expected Output?
  • Do all 4 features (Add/View/Mark/Delete) work?
  • Does the output format match exactly?
  • Are the installation steps correct?

Step 4: Identify Gaps

If anything doesn't match:

  1. Which part of your spec was unclear? (Be specific: "I said 'show tasks' but didn't specify format")
  2. How would you revise it? (Example: Change to "Show tasks in format: 1. Task name [Status] - Due: Date")
  3. Ask ChatGPT to regenerate using your clearer spec

Step 5: Verify Against Framework

  1. Check against what you know: Does the implementation use concepts from earlier chapters (Python basics, file I/O)?
  2. Ask for reasoning: "Why did you implement Add Tasks this way? How does it match my spec?"
  3. Test edge cases: What happens if you try to delete a non-existent task? Did your spec cover this? (See the sketch just after this list.)
  4. Cross-reference: Search GitHub for "Python task tracker CLI" — how does yours compare?
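
Here is the kind of tiny detail that edge-case testing uncovers, sketched as hypothetical Python. Nothing in this snippet comes from your spec; the point is that when a spec is silent about an edge case, the AI has to pick a behavior for you:

```python
# Hypothetical helper showing the "delete a non-existent task" edge case.
# If the spec never says what should happen here, the AI must guess whether
# to crash, silently ignore the request, or print a message.
def delete_task(tasks, task_number):
    if task_number < 1 or task_number > len(tasks):
        print(f"No task #{task_number} found.")   # one possible choice, not THE answer
        return tasks
    removed = tasks.pop(task_number - 1)
    print(f"Deleted: {removed}")
    return tasks

tasks = ["Buy groceries", "Finish homework"]
tasks = delete_task(tasks, 5)   # spec silent on this case: here it prints a warning
tasks = delete_task(tasks, 1)   # normal case: removes "Buy groceries"
```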

Reflection

Write a reflection (5-7 sentences) answering:

  • Did your complete specification (built across 4 lessons) produce working code?
  • What gaps did you discover only after running the code?
  • How would you improve your specification if you wrote it again?
  • Most critical: What did running the code teach you that asking "Is this clear?" never would have?

Why This Full-Cycle Validation Matters

This exercise teaches the complete AI-native development loop:

  1. Write specification (Lessons 2-5) → Intent Layer
  2. AI generates code (Prompt 3) → Reasoning + Implementation Layers
  3. Execute and compare (Validation steps) → Testing Layer
  4. Identify gaps (Verification framework) → Iteration Layer
  5. Revise spec (Based on execution results) → Back to Intent Layer

You're not just learning markdown — you're learning the entire workflow of specification-driven AI development.

If you blindly accepted ChatGPT saying "This spec is clear!" without running the code, you'd miss critical lessons about:

  • How vague language creates ambiguous implementations
  • How missing edge cases lead to broken software
  • How precise specifications produce reliable code

This is professional-level validation. Even with all the markdown skills you've learned, the real test is: Does your spec produce working code?