Prompting Is Professional Judgment, Not a Parlor Trick

Every time a new research or drafting tool enters the legal profession, the same pattern appears. First, people treat it like magic. Then they treat it like a threat. Eventually, we realize it’s neither. It’s just another tool that requires competence.

Prompting artificial intelligence is no different.

Lawyers have always been cautious about new tools. I sometimes joke that if lawyers had existed when the wheel was invented, we would have said we couldn’t possibly use it. It goes too fast. It hasn’t been tested. Someone is going to get hurt.

What’s new is not the need for judgment. What’s new is the interface.

We’ve Been Here Before

When online legal research became widespread, lawyers had to learn Boolean searching. Knowing the law was not enough. You had to know how to tell the database what you were looking for. AND. OR. NOT. Proximity connectors. Field restrictions.

Earlier generations of lawyers learned very quickly that inefficient searching had real costs. When searches were billed per query, precision mattered. Poorly constructed queries produced poor results, and no one blamed Westlaw or Lexis for that. The responsibility belonged to the lawyer.

Later, plain-language searching became more common. That did not eliminate the need for judgment. It simply changed how lawyers expressed their questions. You still had to decide how broad or narrow to be, what assumptions you were making, and whether the results actually answered the question you thought you were asking.

Then came Google. Lawyers had to learn that searching the open web is not the same as searching a curated legal database. You learned to evaluate sources, recognize persuasive but unreliable material, and notice when search results were confidently wrong.

Each of those shifts required new skills layered on top of existing professional judgment.

Prompting AI fits squarely into that pattern.

Prompting Is Instruction, Not Clever Phrasing

A good prompt is not about finding the perfect wording. It is about giving clear instructions.

When lawyers prompt AI tools, they are doing something familiar. They are delegating a task. And just as with a junior lawyer, paralegal, or researcher, the quality of the instruction determines the quality of the work product.

That means being clear about the task, the constraints, and the assumptions. It also means understanding what the tool can do well and where it falls short.

Vague prompts produce vague output. Overly broad prompts produce confident answers that may or may not be grounded in reality. That is not a failure of the technology. It is a failure of instruction.

As with every other tool lawyers use, responsibility for the result remains with the lawyer.

Why Prompting Feels Different

Prompting feels unfamiliar because it looks conversational. That can make people forget that the system on the other end has no judgment of its own.

The AI does not know what matters. It does not know what is risky. It does not know what is missing. It will not tell you when it is guessing. The only way to recognize a guess is to already know enough to spot it.

This is why treating prompting as a shortcut or a trick is so dangerous. It suggests that phrasing can replace judgment. It cannot.

What matters is understanding the legal issue, knowing what kind of output is appropriate, and being able to evaluate what comes back. Those are lawyer skills. They always have been.

The Ethical Throughline Has Not Changed

Nothing about AI changes a lawyer’s ethical obligations. Lawyers are still required to understand the tools they use, supervise work product, verify accuracy, and exercise independent judgment.

Prompting does not shift responsibility to the tool. Saying “the AI did it” is no more defensible than saying “the database gave me the wrong case.”

The tool assists. The lawyer decides.

In short, prompting artificial intelligence requires the same kind of professional judgment lawyers have always exercised when working with new tools.

Learning Prompting Is Part of Competence

Just as lawyers once had to learn how to search legal databases and the open web, they now have to learn how to communicate effectively with AI tools. That learning curve is normal. It is not evidence that the technology is unmanageable or that lawyers are being replaced.

It is evidence that professional competence evolves.

Prompting is not a parlor trick. It is the modern expression of something lawyers have always needed to do well: know what to ask, know how to ask it, and know how to evaluate the answer.

Judgment is still the point.
