AI Survey system based on EdgeOne Pages.
Features:
- Multilingual (Chinese/English)
- AI-supported survey crafting / results analysis
- Multiple question types
- WYSIWYG survey editor with live preview
- Advanced question condition editor
- One entry per IP
- Mobile optimized
EdgeOne Pages functionality used:
- Pages KV
- Pages AI
- Pages Functions
Design considerations:
This is an AI-assisted project with a classic frontend/backend separation.
The frontend uses Font Awesome for icons plus AI-generated, hand-rolled CSS and JS, with no other libraries referenced; the result looks polished and professional, so I kept the design.
The backend uses Pages Functions. Since the frontend does all the heavy lifting, the backend APIs are relatively simple. Some challenges exist, though:
How to save responses
Pages KV is a simple key-value store; it is not designed to be a powerful database for a survey system. How can responses be saved and later viewed reasonably efficiently?
My solution is to provide two saving modes:
- Single: Responses are saved in a single document, which gives the best speed, and since a document can be as large as 25MB, this is a decent choice for most surveys.
- Multi: Responses are saved as one document per response. This provides better reliability and scales better as responses grow. Keys for these documents follow a specific pattern with the survey ID as a prefix to take advantage of the kv.list prefix parameter, speeding up queries by filtering on the survey ID first (see the sketch after this list).
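Below is a minimal sketch of the two modes, assuming an EdgeOne Pages KV binding named mykv (the binding name used in the Deployment section). The key layout (`survey:<id>:responses`, `resp:<surveyId>:<responseId>`) and the exact call signatures are illustrative assumptions based on the description above, not the project's actual code; check the EdgeOne Pages KV docs for the precise API.

```js
// Single mode: all responses for a survey live in one KV document (up to ~25MB).
async function saveResponseSingle(surveyId, response) {
  const key = `survey:${surveyId}:responses`;          // one document per survey
  const existing = JSON.parse((await mykv.get(key)) || "[]");
  existing.push(response);
  await mykv.put(key, JSON.stringify(existing));
}

// Multi mode: one KV document per response, keyed with the survey ID as a prefix.
async function saveResponseMulti(surveyId, responseId, response) {
  await mykv.put(`resp:${surveyId}:${responseId}`, JSON.stringify(response));
}

// Listing responses in multi mode: kv.list's prefix parameter filters by survey ID,
// so unrelated keys are never scanned.
async function listResponsesMulti(surveyId) {
  const { keys } = await mykv.list({ prefix: `resp:${surveyId}:` });
  return Promise.all(
    // The field holding the key string (key vs name) depends on the KV API version.
    keys.map(async (k) => JSON.parse(await mykv.get(k.key ?? k.name)))
  );
}
```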
How to make sure AI responses are valid
My AI features require the LLM to generate JSON following a specific schema. LLMs tend to produce broken data or ignore instructions, making the output unusable; the following method prevents that:
1. Use a system prompt that instructs the LLM to follow the correct schema and deliberately asks it to output JSON only. This works better than putting the instruction in a user prompt.
2. Clean the LLM response before actually using it: extract the JSON first (LLMs tend to explain the code they generate, so this strips those explanations), run it through jsonrepair, and regenerate the IDs the LLM produced (LLMs are bad at unique IDs). A sketch of this cleanup follows below.
Doing the steps above eliminated LLM errors and kept the workflow smooth.
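Here is a minimal sketch of that cleanup pipeline. The jsonrepair call refers to the open-source jsonrepair npm package; the field names (questions, id) are illustrative assumptions rather than the project's actual schema.

```js
import { jsonrepair } from "jsonrepair"; // library for repairing malformed JSON

function cleanLlmJson(rawText) {
  // 1. Extract the JSON portion: LLMs often wrap it in prose or code fences.
  const fenced = rawText.match(/`{3}(?:json)?\s*([\s\S]*?)`{3}/);
  let jsonText = fenced ? fenced[1] : rawText;
  const start = jsonText.indexOf("{");
  const end = jsonText.lastIndexOf("}");
  if (start !== -1 && end > start) jsonText = jsonText.slice(start, end + 1);

  // 2. Repair common breakage (trailing commas, unquoted keys, etc.) and parse.
  const survey = JSON.parse(jsonrepair(jsonText));

  // 3. Regenerate IDs ourselves; LLMs are unreliable at producing unique IDs.
  for (const question of survey.questions ?? []) {
    question.id = crypto.randomUUID(); // requires a runtime with Web Crypto
  }
  return survey;
}
```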
Products I used to make this project
Frontend: websim(Sonnet 4)
Backend: Gemini 2.5 Pro(google AI studio)
I built the frontend first, then the backend, and finally asked the frontend model to integrate with the backend.
Deployment
1. Apply for Pages KV first; this project depends on that feature.
2. Create a KV namespace and bind it to the newly created Pages KV using the variable name mykv.
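As a quick sanity check after binding, a hypothetical Pages Function like the one below can confirm the mykv binding is reachable. It assumes EdgeOne Pages Functions expose the KV namespace under the binding variable name and use the onRequest export convention; verify both against the EdgeOne Pages documentation.

```js
// functions/kv-check.js — hypothetical health-check endpoint (not part of the project).
export async function onRequest() {
  // Write and read back a throwaway key to prove the `mykv` binding works.
  await mykv.put("kv-check", new Date().toISOString());
  const value = await mykv.get("kv-check");
  return new Response(JSON.stringify({ kvBound: value !== null }), {
    headers: { "Content-Type": "application/json" },
  });
}
```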