Chatbot: Claude 3.5 Sonnet
Testbed: the US Weather API.
Task:
- Try out different queries and see how usable the result is
- Develop my skills using AI chatbots
Experiments:
- ran the same query over and over on the same chatbot
- ran the same query on different chatbots
- made queries increasingly specific
- tried different ways to make the queries more specific
Focus:
- Generating test data
I am also exploring using AI with test planning and automation, and I hope to share the results with you in later weeks.
I encountered a few issues. For example:
- when I requested a list of test data that DIDN'T match a regex, the very first item DID match that regex (although the rest did not).
- on another run of the same kind of request, two of the items did match the regex.
- the Claude Sonnet training data was less extensive than I expected, and the results became deterministic surprisingly quickly.
References:
I took some inspiration from prompt engineering examples I've seen, including from Mark Winteringham's book, Software Testing with Generative AI, as well as my own attempts at creating some useful queries.
Test Data
Functions
Create a JavaScript function that returns an array of UTC DateTimes from the current date to 7 days ago, at 24-hour intervals, starting with mid-day of the current date.
function getLastWeekDatesFromMidDay() {
  const dates = [];
  const now = new Date();
  // Set the time to 12:00:00 UTC for today
  now.setUTCHours(12, 0, 0, 0);
  // 8 iterations: today plus the previous 7 days
  for (let i = 0; i < 8; i++) {
    const date = new Date(now);
    date.setUTCDate(now.getUTCDate() - i);
    dates.push(date.toISOString().slice(0, 19) + 'Z');
  }
  return dates.reverse();
}
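The generated function looked right, but I wanted a mechanical check. Here is a small helper (a sketch; `checkMidDayDates` is my own name, not part of the generated code) that verifies an output array holds eight ISO-8601 UTC timestamps at 12:00:00, spaced exactly 24 hours apart, oldest first:

```javascript
// Verify the shape the prompt asked for: 8 UTC timestamps at mid-day,
// exactly 24 hours apart, in ascending order.
function checkMidDayDates(dates) {
  if (dates.length !== 8) return false;
  for (let i = 0; i < dates.length; i++) {
    // Every entry must be an ISO-8601 UTC timestamp at exactly 12:00:00.
    if (!/^\d{4}-\d{2}-\d{2}T12:00:00Z$/.test(dates[i])) return false;
    // Consecutive entries must be exactly 24 hours apart, ascending.
    if (i > 0 && Date.parse(dates[i]) - Date.parse(dates[i - 1]) !== 86400000) {
      return false;
    }
  }
  return true;
}
```

Running `checkMidDayDates(getLastWeekDatesFromMidDay())` makes the chatbot's output self-verifying rather than eyeballed.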
Regular expressions
Can you create a list of dates in date formats that do not match yyyy-mm-dd hh:mm:ss for the DateTime 2020-05-14T05:40:44Z?
"invalidDateFormats": [
"2020-05-14",
"2020-05-14T",
"2020-05-14T5",
"2020-05-14T05",
"2020-05-14T05:40",
"2020-05-14T05:40:4",
"14/05/2020 05:40:44",
"05/14/2020 05:40:44",
"2020/05/14 05:40:44",
"14-05-2020 05:40:44",
"05-14-2020 05:40:44",
"20200514054044",
"14052020054044",
"May 14, 2020 05:40:44",
"14 May 2020 05:40:44",
"Thu, 14 May 2020 05:40:44",
"2020-05-14 05:40:44",
"05:40:44 14-05-2020",
"1589433644",
"20-05-14 05:40:44",
"2020.05.14 05:40:44",
"14/05/20 05:40:44",
"05/14/20 05:40:44"
]
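The kind of slip I mentioned earlier is easy to catch mechanically. Assuming the target format is yyyy-mm-dd hh:mm:ss, this sketch filters a "does not match" list down to any entries that actually do match; against the list above it flags "2020-05-14 05:40:44":

```javascript
// The target format from the prompt, as a regex (an assumption on my part).
const targetFormat = /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}$/;

// Entries returned here were wrongly included in a "does not match" list.
function findAccidentalMatches(candidates) {
  return candidates.filter((s) => targetFormat.test(s));
}
```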
Can you give a list of text that does not match this regular expression? ["AK","AL","AM","AN","AR","AS","AZ","CA","CO","CT","DC","DE","FL","FM","GA","GM","GU","HI","IA","ID","IL","IN","KS","KY","LA","LC","LE","LH","LM","LO","LS","MA","MD","ME","MH","MI","MN","MO","MP","MS","MT","NC","ND","NE","NH","NJ","NM","NV","NY","OH","OK","OR","PA","PH","PK","PM","PR","PS","PW","PZ","RI","SC","SD","SL","TN","TX","UT","VA","VI","VT","WA","WI","WV","WY"]
[
"AK",
"ak",
"Ak",
"aK",
"AAK",
"AKA",
"A K",
"AK ",
" AK",
"AK,",
"AK.",
"AK-",
"AK_",
"(AK)",
"[AK]",
"{AK}",
"\"AK\"",
"AK1",
"1AK",
"AKAK",
"AL-AK",
"Alaska",
"ALASKA",
"USA",
"US",
"United States",
"123",
"AB",
"ZZ",
"XX",
"AA",
"ak,al",
"AK AL",
"AK.AL",
"AK;AL",
"A-K",
"a.k.",
"KA",
"_AK",
"AK_",
"-AK",
"AK!",
"!AK",
"AK?",
"?AK",
null,
"",
" ",
"\\AK",
"AK\\",
"AK\nAL",
"AK\tAL"
]
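The "regular expression" in that prompt is really a fixed list of codes, so set membership is the real oracle. A sketch (with the code list abridged here) shows how the very first item of the response above, "AK", is itself in the list and should never have been suggested:

```javascript
// Abridged code list; the full prompt contains 73 two-letter codes.
const areaCodes = new Set(["AK", "AL", "AM", "AZ", "CA", "NY", "TX", "WY"]);

// Return entries from a "does not match" list that are actually in the set.
function findFalsePositives(candidates) {
  return candidates.filter((s) => typeof s === "string" && areaCodes.has(s));
}
```

Note the `typeof` guard: the chatbot's list includes `null`, which a naive check would choke on.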
Can you give me test data that does not match the regular expression in the line below, and can you provide it in JSON format in an object named "invalidZoneCodes"? "^(A[KLMNRSZ]|C[AOT]|D[CE]|F[LM]|G[AMU]|I[ADLN]|K[SY]|L[ACEHMOS]|M[ADEHINOPST]|N[CDEHJMVY]|O[HKR]|P[AHKMRSWZ]|S[CDL]|T[NX]|UT|V[AIT]|W[AIVY]|[HR]I)[CZ]\d{3}$"
"invalidZoneCodes": [
"AKC1234",
"NYC123",
"CAZ12",
"CAX123",
"ca123",
"AK 123",
"AKCZ123",
"AKC12A",
"AKC-123",
"AKZC123",
"123AKC",
"AKC",
"123",
"AKCA23",
"AK123C",
"AKC1Z3",
"AKZ000",
"RIC1234",
"HIC1234",
"AK123Z",
"UTCABC",
"VAZ1234",
"WYC12",
"ORC12Z",
"MNC123C"
]
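This is the run where two items slipped through. Re-applying the zone-code regex to the "does not match" list (a sketch) shows that "NYC123" and "AKZ000" both match the pattern:

```javascript
// The zone-code pattern from the prompt.
const zonePattern = /^(A[KLMNRSZ]|C[AOT]|D[CE]|F[LM]|G[AMU]|I[ADLN]|K[SY]|L[ACEHMOS]|M[ADEHINOPST]|N[CDEHJMVY]|O[HKR]|P[AHKMRSWZ]|S[CDL]|T[NX]|UT|V[AIT]|W[AIVY]|[HR]I)[CZ]\d{3}$/;

// The chatbot's "invalid" list, verbatim.
const invalidZoneCodes = [
  "AKC1234", "NYC123", "CAZ12", "CAX123", "ca123", "AK 123", "AKCZ123",
  "AKC12A", "AKC-123", "AKZC123", "123AKC", "AKC", "123", "AKCA23",
  "AK123C", "AKC1Z3", "AKZ000", "RIC1234", "HIC1234", "AK123Z", "UTCABC",
  "VAZ1234", "WYC12", "ORC12Z", "MNC123C"
];

// Entries that slipped through: they DO match the pattern.
const accidentalMatches = invalidZoneCodes.filter((s) => zonePattern.test(s));
```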
Can you give me test data that does match the regular expression in the line below, and can you provide it in JSON format in an object named "validZoneCodes"? "^(A[KLMNRSZ]|C[AOT]|D[CE]|F[LM]|G[AMU]|I[ADLN]|K[SY]|L[ACEHMOS]|M[ADEHINOPST]|N[CDEHJMVY]|O[HKR]|P[AHKMRSWZ]|S[CDL]|T[NX]|UT|V[AIT]|W[AIVY]|[HR]I)[CZ]\d{3}$"
"validZoneCodes": [
"AKZ123",
"CAC456",
"DEZ789",
"FLZ012",
"GAC345",
"ILC678",
"KSZ901",
"LAC234",
"MNC567",
"NJZ890",
"OHC123",
"PAZ456",
"SDC789",
"TXZ012",
"UTC345",
"VAZ678",
"WAC901",
"HIZ234",
"RIZ567",
"AZC890",
"COZ123",
"DCZ456",
"GMC789",
"IAZ012",
"KYC345"
]
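The complementary check for the "does match" list is the same idea inverted (a sketch); here every one of the 25 generated codes does satisfy the pattern:

```javascript
// The zone-code pattern from the prompt.
const zonePattern = /^(A[KLMNRSZ]|C[AOT]|D[CE]|F[LM]|G[AMU]|I[ADLN]|K[SY]|L[ACEHMOS]|M[ADEHINOPST]|N[CDEHJMVY]|O[HKR]|P[AHKMRSWZ]|S[CDL]|T[NX]|UT|V[AIT]|W[AIVY]|[HR]I)[CZ]\d{3}$/;

// The chatbot's "valid" list, verbatim.
const validZoneCodes = [
  "AKZ123", "CAC456", "DEZ789", "FLZ012", "GAC345", "ILC678", "KSZ901",
  "LAC234", "MNC567", "NJZ890", "OHC123", "PAZ456", "SDC789", "TXZ012",
  "UTC345", "VAZ678", "WAC901", "HIZ234", "RIZ567", "AZC890", "COZ123",
  "DCZ456", "GMC789", "IAZ012", "KYC345"
];

// Entries that FAIL the pattern; an empty result means the list is clean.
const rejected = validZoneCodes.filter((s) => !zonePattern.test(s));
```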
Classification
Can you create a map of the parameter keys and values used in this API response? After that, can you generate test data for those keys? Your purpose is to check the integrity of the data input by the weather stations, as the migration has not yet completed: some agencies are using ENUMs for some values, while others are still using plain text. What data would you classify as valid, and what would you classify as invalid, considering the training data? It should cover valid and invalid cases, with a focus on big and little values. Please take into account that null and empty fields are not invalid.
Classification of Valid and Invalid Data:
Valid Data:
ENUM values and their plain text equivalents
Numeric values with appropriate units
Null values
Empty arrays
Invalid Data:
Values not matching ENUM or expected plain text
Numeric values with incorrect units or format
Non-numeric values for numeric fields
Test Data:
windThreat:
Valid (Big): ["RADAR INDICATED", "OBSERVED", null, "", "POSSIBLE"]
Valid (Little): ["RADAR INDICATED"], [], null
Invalid (Big): ["RADAR_INDICATED", "STRONG", "WEAK", "UNKNOWN", 123]
Invalid (Little): ["INVALID"]
maxWindGust:
Valid (Big): ["70 MPH", "100 KPH", "50 MPS", null, ""]
Valid (Little): ["1 MPH"], [], null
Invalid (Big): ["500 MPH", "-10 MPH", "100", "STRONG", "70MPH"]
Invalid (Little): ["GUSTY"]
hailThreat:
Valid (Big): ["RADAR INDICATED", "OBSERVED", null, "", "POSSIBLE"]
Valid (Little): ["RADAR INDICATED"], [], null
Invalid (Big): ["RADAR_INDICATED", "LARGE", "SMALL", "UNKNOWN", 123]
Invalid (Little): ["INVALID"]
maxHailSize:
Valid (Big): ["0.88", "1.00", "2.75", null, ""]
Valid (Little): ["0.25"], [], null
Invalid (Big): ["10.00", "-1.00", "GOLF BALL", "LARGE", "0,88"]
Invalid (Little): ["SMALL"]
thunderstormDamageThreat:
Valid (Big): ["CONSIDERABLE", "NONE", "MODERATE", null, ""]
Valid (Little): ["NONE"], [], null
Invalid (Big): ["VERY_CONSIDERABLE", "UNKNOWN", "LIGHT", "HEAVY", 123]
Invalid (Little): ["INVALID"]
tornadoDetection:
Valid (Big): ["RADAR INDICATED", "OBSERVED", null, "", "POSSIBLE"]
Valid (Little): ["RADAR INDICATED"], [], null
Invalid (Big): ["RADAR_INDICATED", "STRONG", "WEAK", "UNKNOWN", 123]
Invalid (Little): ["INVALID"]
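The classification above translates directly into a tolerant validator for the migration period. A sketch (the accepted values come from the lists above; treat them as assumptions, not the API's documented rules):

```javascript
// Accepted values for the *Threat/Detection fields, per the classification
// above (an assumption inferred from the chatbot's lists).
const threatValues = new Set(["RADAR INDICATED", "OBSERVED", "POSSIBLE"]);

// During the migration, null and empty values are explicitly NOT invalid.
function isValidThreat(value) {
  if (value === null || value === "") return true;
  return typeof value === "string" && threatValues.has(value);
}
```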
You are a software tester testing data integrity for a weather data API. Can you suggest a set of test data for the value "maximum hail size"? It should cover positive and negative cases. The positive cases should include edge cases.
"maximumHailSize": {
"validCases": [
0.00,
0.25,
0.50,
0.75,
1.00,
1.25,
1.50,
1.75,
2.00,
2.50,
3.00,
4.00,
4.50,
5.00,
6.00,
7.00,
8.00
],
"edgeCases": [
0.01,
0.10,
9.99,
10.00
],
"invalidCases": [
-0.01,
-1.00,
10.01,
15.00,
100.00,
0.001,
0.999,
"0",
"1.5",
"",
" ",
null,
undefined,
true,
false,
[],
{},
Infinity,
-Infinity,
NaN,
1.5e2,
"trace",
"1.5 inches",
"1,50",
"1/2",
"ping pong",
"golf ball"
]
}
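Reading the response back, the implied rule seems to be: a number from 0.00 to 10.00 inclusive, with at most two decimal places, and strictly typed (so numeric strings fail). A validator sketch under that assumption, which is mine rather than the API's:

```javascript
// Hypothetical rule inferred from the generated cases: numeric, finite,
// 0.00-10.00 inclusive, at most two decimal places, not a string.
function isValidHailSize(v) {
  if (typeof v !== "number" || !Number.isFinite(v)) return false;
  if (v < 0 || v > 10) return false;
  // At most two decimal places, checked on the default string form.
  return /^\d+(\.\d{1,2})?$/.test(String(v));
}
```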
You are a software tester testing data integrity for a weather data API. You are generating tests for the nightly run checking that the data entered by various weather stations is valid. Can you suggest a set of test data for the value "maximum hail size"? It should cover positive, negative, edge, creative, big, little, invalid, exploratory, and boundary-related cases. It should also check for external interference with the system in a non-destructive manner.
"maximumHailSize": {
"validCases": [
0.00,
0.25,
0.50,
1.00,
1.75,
2.50,
4.00,
6.00,
8.00
],
"edgeCases": [
0.01,
7.99,
8.00
],
"bigCases": [
100.00,
1000.00,
9999.99
],
"littleCases": [
0.001,
0.009,
0.01
],
"invalidCases": [
-0.01,
-1.00,
8.01,
"0",
"",
" ",
null,
undefined,
true,
false,
[],
{},
Infinity,
-Infinity,
NaN
],
"creativeInputs": [
"golf ball",
"baseball",
"grapefruit",
"1,50",
"1/2",
"1.5 inches",
"trace",
"melted",
"unknown"
],
"exploratoryInputs": [
1.5e2,
"0xFF",
"0b1010",
"0o777",
"\u0037\u002E\u0035",
" 7.5 ",
"7.5\n",
"7.5\t",
"+7.5",
"7.5"
],
"boundaryTests": [
-0.01,
0.00,
0.01,
7.99,
8.00,
8.01
],
"externalInterference": [
"<script>alert('XSS')</script>",
"'; DROP TABLE weather_data; --",
"%00%3Cscript%3Ealert(%27XSS%27)%3C/script%3E",
"../../../etc/passwd",
"{{7*7}}",
"${7*7}",
"#{7*7}",
"=7+7",
"@{7*7}"
]
}
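One way to consume a categorised set like this in a nightly run is to attach an expected verdict to each category and assert it. A sketch (`isValidHailSizeStub` is a hypothetical validator standing in for the system under test, with an assumed 0-8 inch range):

```javascript
// Hypothetical validator standing in for the system under test.
const isValidHailSizeStub = (v) =>
  typeof v === "number" && Number.isFinite(v) && v >= 0 && v <= 8;

// Each category carries an expected verdict; any mismatch is a failure.
const expectations = [
  { cases: [0.0, 0.25, 1.0, 8.0], expectValid: true },
  { cases: [-0.01, 8.01, "0", null, NaN], expectValid: false },
  {
    cases: ["<script>alert('XSS')</script>", "'; DROP TABLE weather_data; --"],
    expectValid: false,
  },
];

const failures = expectations.flatMap(({ cases, expectValid }) =>
  cases.filter((c) => isValidHailSizeStub(c) !== expectValid)
);
```

The interference strings are exercised non-destructively here: they only need to be *rejected*, never executed.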
I want you to act as an expert software tester who works on creating test data to provide comprehensive test data coverage.
I want you to generate tests for the nightly run checking that the data entered by various weather stations is valid.
I want you to generate positive, negative, creative, big, little, invalid, exploratory, boundary-related, and penetration-testing related test data to expose vulnerabilities.
Here are some common types of test data attacks that you can also learn from and incorporate while creating our own test data:
- Paths/Files: Long Name (>255 chars), Special Characters in Name, Non-Existent characters, Character with No Space.
- Time and Date: Crossing Time Zones, Leap Days, Always Invalid Days (Feb 30, Sept 31), Feb 29 in Non-Leap Years, Different Formats (June 5, 2001; 06/05/2001; 06/05/01; 06-05-01; 6/5/2001 12:34), Internationalisation dd.mm.yyyy, mm/dd/yyyy, am/pm, Daylight Savings Changeover.
- Numbers: 0, 32768 (2^15), 32769 (2^15 + 1), 65536 (2^16), 65537 (2^16 + 1), 2147483648 (2^31), 2147483649 (2^31 + 1), 4294967296 (2^32), 4294967297 (2^32 + 1), Scientific Notation (1E-16), Negative, Floating Point/Decimal (0.0001), With Commas (1,234,567), European Style (1.234.567,89).
- Strings: Long (255, 256, 257, 1000, 1024, 2000, 2048 or more characters), Accented Chars (àáâãäåçèéêëìíîðñòôõö, etc.), Asian Characters, Common Delimiters and Special Characters ( " ' ` | / \ , ; : & < > ^ * ? Tab ), Leave Blank, Single Space, Multiple Spaces, Leading Spaces, SQL Injection ('select * from customer), Emojis.
Provide the results in json format.
I want you to generate as many cases as you can think of for each test case type.
These are the variable names to create test data for: "maximum hail size"
{
"type": "Positive",
"value": 1.0,
"description": "Valid small hail size in inches"
},
{
"type": "Positive",
"value": 2.5,
"description": "Valid medium hail size in inches"
},
{
"type": "Positive",
"value": 4.75,
"description": "Valid large hail size in inches"
},
{
"type": "Positive",
"value": 8.0,
"description": "Valid extreme hail size in inches (near record)"
},
{
"type": "Boundary",
"value": 0.0,
"description": "Minimum possible size"
},
{
"type": "Boundary",
"value": 0.01,
"description": "Just above minimum size"
},
{
"type": "Boundary",
"value": 9.99,
"description": "Very large, but still plausible"
},
{
"type": "Boundary",
"value": 10.0,
"description": "Maximum plausible size"
},
{
"type": "Negative",
"value": -1.0,
"description": "Invalid negative size"
},
{
"type": "Negative",
"value": -0.01,
"description": "Small negative value"
},
{
"type": "Invalid",
"value": 100.0,
"description": "Unrealistically large size"
},
{
"type": "Invalid",
"value": 1000.0,
"description": "Absurdly large size"
},
{
"type": "Invalid",
"value": "abc",
"description": "Non-numeric input"
},
{
"type": "Invalid",
"value": "1,5",
"description": "Comma instead of decimal point"
},
{
"type": "Invalid",
"value": "1.5.5",
"description": "Multiple decimal points"
},
{
"type": "Invalid",
"value": "1.5\"",
"description": "Unit included in the value"
},
{
"type": "Invalid",
"value": "1/2",
"description": "Fraction instead of decimal"
},
{
"type": "Creative",
"value": 3.14159,
"description": "Pi as hail size"
},
{
"type": "Creative",
"value": 9.8,
"description": "Gravity acceleration as hail size"
},
{
"type": "Creative",
"value": 299792458,
"description": "Speed of light as hail size"
},
{
"type": "Big",
"value": 999999999.99,
"description": "Very large number"
},
{
"type": "Little",
"value": 0.000000001,
"description": "Very small positive number"
},
{
"type": "Exploratory",
"value": "1E-6",
"description": "Scientific notation"
},
{
"type": "Exploratory",
"value": "1.5e2",
"description": "Scientific notation for large value"
},
{
"type": "Exploratory",
"value": "+2.5",
"description": "Explicit positive sign"
},
{
"type": "Exploratory",
"value": "0x1A",
"description": "Hexadecimal value"
},
{
"type": "Exploratory",
"value": "0b1010",
"description": "Binary value"
},
{
"type": "Boundary",
"value": 32767,
"description": "2^15 - 1 (max 16-bit signed integer)"
},
{
"type": "Boundary",
"value": 32768,
"description": "2^15"
},
{
"type": "Boundary",
"value": 65535,
"description": "2^16 - 1 (max 16-bit unsigned integer)"
},
{
"type": "Boundary",
"value": 2147483647,
"description": "2^31 - 1 (max 32-bit signed integer)"
},
{
"type": "Invalid",
"value": null,
"description": "Null value"
},
{
"type": "Invalid",
"value": "",
"description": "Empty string"
},
{
"type": "Invalid",
"value": " ",
"description": "Space character"
},
{
"type": "Invalid",
"value": " 2.5 ",
"description": "Value with leading/trailing spaces"
},
{
"type": "Penetration",
"value": "'; DROP TABLE weather_data; --",
"description": "SQL injection attempt"
},
{
"type": "Penetration",
"value": "<script>alert('XSS')</script>",
"description": "Cross-site scripting attempt"
},
{
"type": "Internationalization",
"value": "2,5",
"description": "European decimal format"
},
{
"type": "Internationalization",
"value": "२.५",
"description": "Hindi numerals"
},
{
"type": "Internationalization",
"value": "二.五",
"description": "Chinese numerals"
},
{
"type": "Time-related",
"value": "1.5@12:00",
"description": "Size with time"
},
{
"type": "Date-related",
"value": "2.0@2024-02-29",
"description": "Size with leap day date"
},
{
"type": "Invalid",
"value": "🌨️",
"description": "Weather-related emoji"
},
{
"type": "Invalid",
"value": "∞",
"description": "Infinity symbol"
},
{
"type": "Invalid",
"value": "NaN",
"description": "Not a Number"
},
{
"type": "Creative",
"value": "MAXIMUM",
"description": "Text instead of number"
},
{
"type": "Creative",
"value": -273.15,
"description": "Absolute zero temperature as size"
}
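To feed a flat type/value/description list like this into a runner, grouping by the type label is a natural first step, so each category can be routed to its own assertion. A sketch:

```javascript
// Group generated cases by their "type" label; the result maps each label
// to the list of values in that category.
function groupByType(cases) {
  const byType = {};
  for (const c of cases) {
    (byType[c.type] = byType[c.type] || []).push(c.value);
  }
  return byType;
}
```

Usage: `groupByType(cases).Positive` would hold all the positive values from the list above, ready to assert against the validator.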