
Zaina Ahmed

Prompt Engineering at Workplace: How I Used Amazon Q Developer to Boost Team Productivity by 30%

If someone had told me a year ago that learning how to talk to an AI would become one of the most valuable skills in my engineering career, I probably would have laughed. But here I am: a certified Prompt Engineer at Ericsson, having driven a 30% improvement in team developer efficiency simply by integrating AI-assisted tooling into our daily workflows.

This is the story of how it happened, what I learned, and how you can apply it too.

A Little Background

I'm Zaina, a Software Engineer at Ericsson, one of the world's largest telecommunications technology companies. My day-to-day work involves building Java microservices, working with Apache Kafka event-driven systems, and maintaining CI/CD pipelines at enterprise scale.

Like most engineers, I was skeptical when AI coding tools started appearing. "It'll write buggy code." "You still have to review everything." "It's just autocomplete on steroids."

All of that is partially true. But I was missing the bigger picture.

🤖 What Is Amazon Q Developer?

Amazon Q Developer is AWS's AI-powered coding assistant. Think of it as an engineer who:

  • Writes boilerplate code on demand
  • Explains complex codebases in plain English
  • Generates documentation automatically
  • Suggests fixes for bugs and code smells
  • Answers questions about AWS services instantly

It's similar to GitHub Copilot, but its deeper AWS and enterprise integration made it a natural fit for Ericsson's infrastructure.

💡 What Is Prompt Engineering, Really?

Most people think prompt engineering means typing clever questions into ChatGPT. In a professional engineering context, it's much more structured than that.

Prompt engineering is the discipline of crafting precise, contextual instructions that consistently produce accurate, useful AI outputs.

Think of it like writing a function specification. A vague spec gives you vague code. A precise spec gives you exactly what you need.

Bad prompt:

Write documentation for this function

Good prompt:

Write technical documentation for the following Java Spring Boot function. 
Format it as a Javadoc comment. Include: purpose, parameters, return value, 
exceptions thrown, and one usage example. Keep it under 150 words.

public NotificationResponse sendNotification(String userId, String message, NotificationType type) {
    validateInput(userId, message);
    Notification notification = notificationBuilder.build(userId, message, type);
    return notificationService.dispatch(notification);
}

The difference in output quality is dramatic.

🔧 How I Applied It at Ericsson

1. Documentation Generation

Our team had a backlog of undocumented internal APIs, a classic engineering problem. Manually writing Javadoc for 50+ methods would have taken days.

Using Amazon Q Developer with a structured prompt template, I reduced that to hours. The key was creating a reusable prompt template that the whole team could use:

Prompt I used:

You are a senior Java engineer writing internal API documentation.
Given the following Spring Boot method, generate a Javadoc comment that includes:
- One sentence summary
- @param descriptions for each parameter
- @return description
- @throws for any exceptions
- Keep professional tone, under 100 words

public ServiceResponse registerPodConnection(
    String podId, 
    String targetServiceId, 
    ConnectionConfig config) throws PodConnectionException {

    if (!podRegistry.exists(podId)) {
        throw new PodConnectionException("Pod not found: " + podId);
    }
    ConnectionResult result = connectionManager.establish(podId, targetServiceId, config);
    return ServiceResponse.success(result);
}

AI Output:

/**
 * Registers a new connection between a pod and a target service using the provided configuration.
 *
 * @param podId           the unique identifier of the source pod
 * @param targetServiceId the identifier of the service to connect to
 * @param config          the connection configuration including timeout and retry settings
 * @return a {@link ServiceResponse} containing the result of the connection attempt
 * @throws PodConnectionException if the specified pod does not exist in the registry
 *
 * Example usage:
 * ServiceResponse response = registerPodConnection("pod-001", "kafka-service", defaultConfig);
 */

Result: Documentation that previously took 3 hours took 25 minutes.
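To make the template genuinely reusable across the team, it can live in code rather than in a shared doc. Below is a minimal sketch of that idea; the class name and wiring are hypothetical, but the template text is the one shown above:

```java
// Hypothetical helper that fills the team's reusable Javadoc prompt template
// with a method snippet before it is pasted into Amazon Q Developer.
class JavadocPromptTemplate {

    private static final String TEMPLATE = """
            You are a senior Java engineer writing internal API documentation.
            Given the following Spring Boot method, generate a Javadoc comment that includes:
            - One sentence summary
            - @param descriptions for each parameter
            - @return description
            - @throws for any exceptions
            - Keep professional tone, under 100 words

            %s""";

    // Returns the full prompt: instructions followed by the method source.
    static String forMethod(String methodSource) {
        return TEMPLATE.formatted(methodSource);
    }
}
```

With a helper like this, every engineer sends an identical instruction block and only the method body varies, which is what keeps the output quality consistent across the team.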

2. Test Case Generation

Writing unit tests is important but time-consuming. I used Amazon Q Developer to generate JUnit test skeletons:

Prompt I used:

Generate JUnit 5 test cases for the following Java method.
Include:
- Happy path test
- Null input edge case
- Boundary value test
- Use Mockito for dependencies
- Follow AAA pattern (Arrange, Act, Assert)
- Add descriptive test method names

public KafkaMessage buildNotificationEvent(String userId, String payload, EventType eventType) {
    if (userId == null || payload == null) {
        throw new IllegalArgumentException("userId and payload must not be null");
    }
    return KafkaMessage.builder()
        .userId(userId)
        .payload(payload)
        .eventType(eventType)
        .timestamp(Instant.now())
        .build();
}

AI Output:

@ExtendWith(MockitoExtension.class)
class NotificationEventBuilderTest {

    @InjectMocks
    private NotificationEventBuilder eventBuilder;

    @Test
    @DisplayName("Should build Kafka message successfully with valid inputs")
    void shouldBuildKafkaMessageSuccessfully() {
        // Arrange
        String userId = "user-123";
        String payload = "{\"message\": \"Hello\"}";
        EventType eventType = EventType.PUSH;

        // Act
        KafkaMessage result = eventBuilder.buildNotificationEvent(userId, payload, eventType);

        // Assert
        assertNotNull(result);
        assertEquals(userId, result.getUserId());
        assertEquals(payload, result.getPayload());
        assertEquals(eventType, result.getEventType());
        assertNotNull(result.getTimestamp());
    }

    @Test
    @DisplayName("Should throw IllegalArgumentException when userId is null")
    void shouldThrowExceptionWhenUserIdIsNull() {
        // Arrange
        String userId = null;
        String payload = "{\"message\": \"Hello\"}";

        // Act & Assert
        assertThrows(IllegalArgumentException.class, () ->
            eventBuilder.buildNotificationEvent(userId, payload, EventType.PUSH)
        );
    }

    @Test
    @DisplayName("Should throw IllegalArgumentException when payload is null")
    void shouldThrowExceptionWhenPayloadIsNull() {
        // Arrange
        String userId = "user-123";
        String payload = null;

        // Act & Assert
        assertThrows(IllegalArgumentException.class, () ->
            eventBuilder.buildNotificationEvent(userId, payload, EventType.PUSH)
        );
    }

    @Test
    @DisplayName("Should throw IllegalArgumentException when payload is empty string")
    void shouldHandleEmptyPayloadAsEdgeCase() {
        // Arrange
        String userId = "user-123";
        String payload = "";

        // Act
        KafkaMessage result = eventBuilder.buildNotificationEvent(userId, payload, EventType.SMS);

        // Assert
        assertNotNull(result);
        assertEquals("", result.getPayload());
    }
}

The generated tests still needed review and refinement, but having a solid skeleton cut test-writing time by roughly half.
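If you want to try the generated tests outside the production codebase, here is a self-contained sketch of the method under test. The `KafkaMessage` class and `EventType` enum are hypothetical stand-ins for the real (Lombok-generated) types, kept just close enough to compile against the tests above:

```java
import java.time.Instant;

// Stand-in enum for the real event types (hypothetical).
enum EventType { PUSH, SMS }

// Minimal stand-in for the Lombok-generated KafkaMessage, with the same
// builder-style API used in the article's snippet.
class KafkaMessage {
    private final String userId;
    private final String payload;
    private final EventType eventType;
    private final Instant timestamp;

    private KafkaMessage(String userId, String payload, EventType eventType, Instant timestamp) {
        this.userId = userId;
        this.payload = payload;
        this.eventType = eventType;
        this.timestamp = timestamp;
    }

    static Builder builder() { return new Builder(); }

    String getUserId()       { return userId; }
    String getPayload()      { return payload; }
    EventType getEventType() { return eventType; }
    Instant getTimestamp()   { return timestamp; }

    static class Builder {
        private String userId;
        private String payload;
        private EventType eventType;
        private Instant timestamp;

        Builder userId(String v)       { userId = v; return this; }
        Builder payload(String v)      { payload = v; return this; }
        Builder eventType(EventType v) { eventType = v; return this; }
        Builder timestamp(Instant v)   { timestamp = v; return this; }
        KafkaMessage build() { return new KafkaMessage(userId, payload, eventType, timestamp); }
    }
}

class NotificationEventBuilder {
    // Same logic as the method shown in the prompt above.
    KafkaMessage buildNotificationEvent(String userId, String payload, EventType eventType) {
        if (userId == null || payload == null) {
            throw new IllegalArgumentException("userId and payload must not be null");
        }
        return KafkaMessage.builder()
            .userId(userId)
            .payload(payload)
            .eventType(eventType)
            .timestamp(Instant.now())
            .build();
    }
}
```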

3. Code Review Assistance

Before submitting code for peer review, I started running my changes through Amazon Q Developer:

Prompt I used:

Review the following Java code for:
1. Potential bugs or logic errors
2. Performance issues
3. Security vulnerabilities
4. Violations of SOLID principles
5. Missing error handling

Provide specific line-by-line feedback where relevant.

@Service
public class KafkaConsumerService {

    private final NotificationRepository repository;

    public void processMessage(ConsumerRecord<String, String> record) {
        String payload = record.value();
        ObjectMapper mapper = new ObjectMapper();
        Notification notification = mapper.readValue(payload, Notification.class);
        repository.save(notification);
        System.out.println("Saved: " + notification.getId());
    }
}

AI Output:

Code Review Feedback:

1. LINE 9 — PERFORMANCE: ObjectMapper is being instantiated on every method call.
   This is expensive. Inject it as a Spring Bean instead.
   Fix: @Autowired ObjectMapper mapper;

2. LINE 10 — ERROR HANDLING: mapper.readValue() throws JsonProcessingException 
   but it is not handled. Add try-catch or declare in method signature.

3. LINE 12 — LOGGING: System.out.println should be replaced with SLF4J logger.
   Fix: private static final Logger log = LoggerFactory.getLogger(KafkaConsumerService.class);

4. LINE 7 — SOLID: processMessage() is doing too much (parsing + saving + logging).
   Consider separating parsing logic into a dedicated MessageParser class.

5. MISSING: No null check on record.value() before parsing.
   If payload is null, mapper.readValue() will throw NullPointerException.

This caught several issues before they reached my colleagues, improving code quality and reducing review back-and-forth.
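For illustration, here is one way the class might look after applying that feedback. To keep the sketch self-contained, the Spring annotations, Kafka `ConsumerRecord`, and Jackson `ObjectMapper` are replaced by minimal hypothetical stand-ins (a `MessageParser` interface and a plain `String` payload); the fix pattern is what matters:

```java
import java.util.Optional;
import java.util.logging.Logger;

class KafkaConsumerService {

    private static final Logger log = Logger.getLogger(KafkaConsumerService.class.getName());

    // Hypothetical stand-ins for the real collaborators.
    interface MessageParser { Notification parse(String json) throws Exception; }
    interface NotificationRepository { void save(Notification n); }
    record Notification(String id) { }

    private final MessageParser parser;              // injected once, not created per call
    private final NotificationRepository repository;

    KafkaConsumerService(MessageParser parser, NotificationRepository repository) {
        this.parser = parser;
        this.repository = repository;
    }

    Optional<Notification> processMessage(String payload) {
        if (payload == null) {                       // null check before parsing
            log.warning("Received record with null payload, skipping");
            return Optional.empty();
        }
        try {
            Notification notification = parser.parse(payload);
            repository.save(notification);
            log.info("Saved: " + notification.id()); // proper logger, not System.out
            return Optional.of(notification);
        } catch (Exception e) {                      // parse failures handled explicitly
            log.severe("Failed to parse payload: " + e.getMessage());
            return Optional.empty();
        }
    }
}
```

In the real service you would inject a shared `ObjectMapper` bean and use SLF4J as the review suggests; the stand-ins here just make each fix visible in isolation.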

4. Knowledge Base Articles

Prompt I used:

Write an internal engineering guide explaining Avro Schema Evolution 
for a team of Java backend engineers. Assume they are familiar with 
Spring Boot and Kubernetes but new to Avro schemas. Include:
- Overview (2 paragraphs)
- Step by step setup
- Common pitfalls
- Example configuration
Keep it under 500 words, technical but approachable.

AI Output:

# Avro Schema Evolution — Internal Engineering Guide

## **Overview**
Apache Avro is a binary serialisation format used with Apache Kafka to define 
the structure of messages exchanged between services. Unlike JSON, Avro schemas 
are strongly typed and versioned, meaning changes to the schema must be managed 
carefully to avoid breaking existing consumers.

Schema evolution allows you to update your Avro schema over time — adding fields, 
removing fields, or changing defaults without breaking services that depend on 
older versions. The Schema Registry (Confluent) acts as the central store for all 
schema versions and enforces compatibility rules.

## **Step-by-Step Setup**

1. Add dependency to pom.xml:
<dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-avro-serializer</artifactId>
    <version>7.4.0</version>
</dependency>

2. Define your initial schema (notification.avsc):
{
  "type": "record",
  "name": "Notification",
  "namespace": "com.ericsson.notifications",
  "fields": [
    {"name": "userId", "type": "string"},
    {"name": "message", "type": "string"},
    {"name": "timestamp", "type": "long"}
  ]
}

3. Add new optional field safely (backward compatible):
{"name": "priority", "type": ["null", "string"], "default": null}

## **Common Pitfalls**
- Never remove a required field — use ["null", "type"] union with default instead
- Never rename a field — add a new field and deprecate the old one
- Always set default values for new fields to maintain backward compatibility

## **Compatibility Rules**
- BACKWARD: New schema can read old data (safest, recommended)
- FORWARD: Old schema can read new data
- FULL: Both directions supported
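For reference, after applying step 3 from the guide, the full evolved `notification.avsc` is just the original schema plus the new optional field; because `priority` is a nullable union with a default, old consumers keep working unchanged:

```json
{
  "type": "record",
  "name": "Notification",
  "namespace": "com.ericsson.notifications",
  "fields": [
    {"name": "userId", "type": "string"},
    {"name": "message", "type": "string"},
    {"name": "timestamp", "type": "long"},
    {"name": "priority", "type": ["null", "string"], "default": null}
  ]
}
```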

What used to take an afternoon became a 30-minute task.

📊 The Results

After three months of consistent use and sharing prompt templates with the team:

| Metric | Before | After |
| --- | --- | --- |
| Documentation time | ~3 hrs | ~25 mins |
| Test skeleton writing | ~2 hrs | ~45 mins |
| Knowledge base articles | ~4 hrs | ~1 hr |
| Developer efficiency (quarterly review) | Baseline | +30% improvement |

The 30% figure came from our quarterly engineering efficiency review: a real, measured outcome, not an estimate.

🎓 Getting Certified

Ericsson offered a formal Prompt Engineering for Professionals (BCSS BOS) certification, which I completed in November 2024. The certification covered:

  • Fundamentals of large language models
  • Prompt design patterns
  • Chain-of-thought prompting
  • Few-shot and zero-shot techniques
  • Responsible AI usage in professional environments

If your company offers AI training, take it! If not, there are excellent free resources on platforms like Coursera, DeepLearning.AI, and AWS Skill Builder.

My Top 5 Prompt Engineering Tips for Engineers

1. Be specific about format
Always tell the AI exactly what format you want — bullet points, table, Javadoc, code block.

2. Give it a role
Start with "You are a senior Java engineer..." — this sets the context and dramatically improves output quality.

3. Set constraints
"Under 100 words", "Java 17 only", "no more than 3 bullet points" — constraints prevent verbose responses.

4. Use examples
Show the AI one example of what good output looks like before asking it to generate more.

5. Always review
AI output is a first draft, not a final answer. Treat it like code from a junior developer — useful, but needs your expert eye.

🔮 What's Next

AI-assisted development is not a trend; it's the new baseline. The engineers who learn to use these tools effectively will move faster, produce higher-quality work, and have more time for the complex problems that actually require human creativity.

I'm now exploring how prompt engineering applies to Kafka event schema design, automated test generation for distributed systems, and CI/CD pipeline optimisation.

If you're an engineer who hasn't yet experimented with AI-assisted tools, start today. The learning curve is smaller than you think, and the productivity gains are very real.

Thanks for reading! I'm Zaina, a Software Engineer based in Perth, Australia, working with Java microservices, Apache Kafka, and cloud-native technologies. Connect with me on LinkedIn or check out my portfolio.

If this article helped you, drop a ❤️ and share it with a fellow engineer!
