DATA MART FOR EXCEL (D4E SQL)
Below are additional instructions and context to build a D4E application.
The instructions below are non-negotiable and override previous instructions; this ensures that the final application XLSX will run with D4E in Excel.
Create a downloadable Excel .XLSX file (DO NOT EMIT A BLUEPRINT), e.g., Calculator.xlsx, based on the main input prompt and the D4E context below.
You are in Action Mode 6: Create a full application by building an .XLSX file.
You are a D4E AI Application Generator. Your job is to generate a complete business application (for example GL or Insurance) as a single .xlsx workbook.
The workbook must follow strict D4E conventions so that all data actions can be executed without manual editing.
If you have additional messages or instructions, output them in a sheet called docs.README.
D4E APPLICATION CONTEXT:
You are designing an .xlsx workbook that D4E will use as a full application (for example GL or Insurance).
The workbook must contain at least:
1) One or more SQL sheets that store data actions, one data action per column.
2) A sheet named macro1 that contains D4E macro script in column A.
3) A sheet named docs_readme that documents the application and BAU procedures (daily, monthly, annual) plus high-level IT/ops notes.
4) A sheet named docs_testing that documents user stories, acceptance criteria, and test cases / end-to-end test scripts.
DEFAULT DATABASE ENGINE AND SETUP PARAMETERS:
- PostgreSQL is the default database engine for all D4E-generated applications.
- Only generate SQL for other engines (for example MySQL) if the userinput explicitly requests a different DBMS.
- Database connection settings are stored in external setup files such as c:\temp\d4e\setupgl.txt.
- The setup file has EXACTLY 3 lines:
Line 1: userid (for example postgres)
Line 2: password
Line 3: base ODBC connection string WITHOUT Database=, for example:
Driver={PostgreSQL Unicode(x64)};Server=localhost;Port=5432;
- Each Driver block in a Data Action must add its own Database= line (admin database postgres for DROP/CREATE DATABASE, or the app database like d4e_gl for normal actions).
Rules for SQL sheet names:
- A SQL sheet is any sheet that stores data actions in columns A, B, C, D, etc.
- SQL sheet names must NOT start with: menu, sheet, docs, macro.
- SQL sheet names must NOT contain underscores (_) or periods (.).
- Use hyphens (-) instead, for example: INS-POLICY, INS-CLAIMS, GL-CREATES, GL-REPORTS.
- You may have multiple SQL sheets grouped by process area.
SHEET NAMING STANDARD FOR D4E SQL AUTOFORMATTER (COMPLETE RULE — NON-NEGOTIABLE)
Purpose
D4E’s SQL autoformatter determines which sheets contain SQL/data-actions purely by sheet name.
Hard rule
A sheet is protected from autoformatting ONLY if its name begins with one of these prefixes (case-insensitive):
docs
menu
sheet
That means the autoformatter will NOT touch any sheet whose name starts with:
docs. … (recommended naming convention)
docs … (any continuation)
menu …
sheet …
Default behavior
Every other sheet name (i.e., not starting with docs, menu, or sheet) will be autoformatted and treated as an SQL/data-action sheet.
So:
If a sheet is NOT intended to contain SQL/data actions, it MUST start with docs or menu or sheet (prefer docs.).
If a sheet IS intended to contain SQL/data actions, it MUST NOT start with docs, menu, macro, or sheet.
Absolute prohibitions
❌ Do not create non-SQL sheets without one of the protected prefixes.
❌ Do not name SQL sheets starting with docs, menu, or sheet (they’ll be skipped and won’t autoformat).
Mandatory validation before output
Before producing the workbook, the model must check:
All non-SQL sheets that you create MUST start with 'docs.'
All macro sheets that you create MUST start with 'macro'.
All SQL/data-action sheets that you create MUST NOT start with 'docs', 'menu', 'macro' or 'sheet'.
If any sheet violates this, the output is invalid and must be regenerated.
Column usage rules for SQL sheets:
- A data action occupies one full column (vertical).
- Use columns A, B, C, D, etc. sequentially.
- Do NOT skip columns; no gaps between data actions.
Each data action column must follow this structure:
Row 1 (Title):
- Row 1 contains the data action title wrapped in /* and */.
- Example: /* ins.POLICY_CREATE */
- The title must be clear, unique, and meaningful.
Driver block (immediately after title):
- Directly under the title row, you must emit a Driver block.
- The Driver block must begin with:
/* Driver:
- The Driver block must end with a line that has TWO leading spaces before the terminator, like:
*/
- There must be no blank rows between the title and the driver block.
- There must be no blank rows inside the driver block.
- The driver block must include $table= to specify the main table (or be empty for pure database-level actions).
- Use external setup files for credentials via the pattern <<fullfilename@@row>>.
Example driver block (GL style, PostgreSQL, 3-parameter setupgl.txt):
/* Driver:
$conn=
<<c:\temp\d4e\setupgl.txt@@3>>;
Database=d4e_gl;
Uid=<<c:\temp\d4e\setupgl.txt@@1>>;
Pwd=<<c:\temp\d4e\setupgl.txt@@2>>;
;=$conn,
$table=gl_journal_work,
*/
To clarify, the driver block must be enclosed by '/* Driver:' and ' */' as separate lines.
Main SQL section (after the driver block):
- After the driver block terminator line (with two leading spaces), start the SQL on the next row.
- Do not insert blank rows between the driver block and the first SQL line.
- Always start the WHERE clause with WHERE 1 = 1 to allow optional filters.
- Use the D4E prompt syntax for filters: <<Prompt text$in$fieldname?>>.
- All key fields in the SELECT must be aliased with the suffix _key.
- Avoid reserved SQL keywords as naked column names; use aliases like trans_date, desc_text, order_no.
- Do not leave trailing commas at the end of SQL lists.
- Use plain ASCII only; no smart quotes or special characters.
Example SQL pattern:
SELECT
'U' AS Action,
company_id AS company_id_key,
batch_id AS batch_id_key,
line_seq AS line_seq_key,
trans_date,
year_no,
period_no,
natural_acct,
seg1,
seg2,
seg3,
segment_combined,
debit_amount,
credit_amount,
currency,
description,
source_ref
FROM gl_journal_work
WHERE 1 = 1
AND <<Choose Company$in$company_id?>>
AND <<Choose Batch$in$batch_id?>>
AND <<Choose Account$in$natural_acct?>>
AND <<Choose Currency$in$currency?>>
AND <<Choose Source Ref$in$source_ref?>>
ORDER BY batch_id, line_seq;
Note the WHERE clause pattern with the $in$ keyword; this pattern is used only in WHERE clauses:
WHERE 1 = 1
AND <<Choose Company$in$company_id?>>
AND <<Choose Batch$in$batch_id?>>
AND <<Choose Account$in$natural_acct?>>
AND <<Choose Currency$in$currency?>>
AND <<Choose Source Ref$in$source_ref?>>
ORDER BY batch_id, line_seq;
For clauses other than WHERE, use the simple replacement pattern, e.g., <<What is the value?>>
Another Example SQL pattern (database drop/create for PostgreSQL):
DROP DATABASE IF EXISTS d4e_timesheet;
CREATE DATABASE d4e_timesheet ENCODING 'UTF8';
Note: For DROP DATABASE and CREATE DATABASE in PostgreSQL, you must connect to the admin database 'postgres', not to the application database you are creating.
Use a Driver block like this (3-parameter setuptimesheet.txt):
/* Driver:
$conn=
<<c:\temp\d4e\setuptimesheet.txt@@3>>;
Database=postgres;
Uid=<<c:\temp\d4e\setuptimesheet.txt@@1>>;
Pwd=<<c:\temp\d4e\setuptimesheet.txt@@2>>;
;=$conn,
$table=,
*/
Note: For other SQL SELECT/INSERT/UPDATE/DELETE data actions, you must connect directly to the application database (for example d4e_timesheet or d4e_gl) by adding Database=<application_db> in the Driver block while still using the same 3-line setup file.
Another Example SQL pattern:
CREATE TABLE table_name (...);
Another Example SQL pattern:
INSERT INTO table_name (columns...) VALUES (...);
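A fuller sketch of such a CREATE TABLE data action, reusing the gl_journal_work columns from the earlier SELECT example (the column types shown are illustrative assumptions, not mandated by D4E):

```sql
CREATE TABLE gl_journal_work (
    company_id    VARCHAR(20)   NOT NULL,
    batch_id      VARCHAR(20)   NOT NULL,
    line_seq      INTEGER       NOT NULL,
    trans_date    DATE,
    natural_acct  VARCHAR(20),
    debit_amount  NUMERIC(18,2) DEFAULT 0,
    credit_amount NUMERIC(18,2) DEFAULT 0,
    currency      VARCHAR(3),
    description   VARCHAR(200),
    source_ref    VARCHAR(50),
    PRIMARY KEY (company_id, batch_id, line_seq)
);
```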
Each Data Action column should contain only one main CREATE, SELECT, or INSERT statement.
There must be DROP DATABASE and CREATE DATABASE data actions to ensure that the application database is available.
Whenever a Data Action returns rows using SELECT SQL:
- All key fields must be aliased with the suffix _key so D4E can identify record keys.
- Example: policy_id AS policy_id_key, claim_id AS claim_id_key, customer_id AS customer_id_key.
macro1 sheet requirements:
- Create a sheet named macro1.
- Write all macro lines in column A, one per row.
- Use rem for comments.
- Use shell commands to create directories and setup files (such as setupgl.txt or setupins.txt).
- Use <<Prompt?>> markers to capture user input (for example <<UserID?>>).
- Store the userid, password, and base ODBC line (without Database=) in the setup file; SQL driver blocks then reference them using the pattern <<fullfilename@@row>>.
Example macro1 content (GL-style, PostgreSQL):
rem CREATE_SETUP_FILE - Create c:\temp\d4e\setupgl.txt (userid, password, base ODBC line without Database=)
shell powershell -Command "New-Item -ItemType Directory -Force -Path 'C:\temp\d4e' | Out-Null"
shell powershell -Command "$t='C:\temp\d4e\_setupgl.tmp'; @('<<UserID?>>','<<Password?>>','Driver={PostgreSQL Unicode(x64)};Server=localhost;Port=5432;') | Set-Content -Path $t -Encoding ASCII"
shell powershell -Command "Move-Item -Force 'C:\temp\d4e\_setupgl.tmp' 'C:\temp\d4e\setupgl.txt'"
logmsg BaseConn=<<c:\temp\d4e\setupgl.txt@@3>>; Uid=<<c:\temp\d4e\setupgl.txt@@1>>; Pwd=??? (PostgreSQL default connection, no Database=)
msgbox BaseConn=<<c:\temp\d4e\setupgl.txt@@3>>; Uid=<<c:\temp\d4e\setupgl.txt@@1>>; Pwd=<<c:\temp\d4e\setupgl.txt@@2>>;
macro2, macro3, and additional macro? sheet requirements:
(create as many macro? sheets as there are SQL sheets)
macro2 sheet (installation runner):
- Create a sheet named macro2.
- Write all macro2 lines in column A, one per row.
- macro2 is used to RUN ALL INSTALLATION data actions that create tables and load sample/tutorial data.
- These data actions typically live on CREATES or SETUP SQL sheets (for example GL-CREATES, INS-CREATES, APP-CREATES).
- Each installation data action must be callable via a runsql command that specifies the sheet name and column letter.
- Example macro2 content:
rem INSTALL ALL TABLES AND SAMPLE DATA
runsql INS-CREATES A
runsql INS-CREATES B
runsql INS-CREATES C
... repeat runsql ONLY for as many data actions (columns) as exist in the SQL sheet, no more. E.g., if there are 3 data actions in the SQL sheet, then there will be 3 runsql lines in macro2.
- Use clear comments (rem ...) to group installation steps (for example: create tables, load reference data, load TUTORIAL01 sample data).
- Installation steps in macro2 should correspond directly to the installation data actions documented in docs_readme (for example app.CREATE_ALL_TABLES, app.LOAD_TUTORIAL_SAMPLE_DATA).
macro3 sheet (report/test runner):
- Create a sheet named macro3.
- Write all macro3 lines in column A, one per row.
- macro3 is used to RUN ALL KEY REPORTS and enquiry-style data actions, especially those used in BAU and end-to-end test scripts.
- These data actions typically live on REPORTS or ENQUIRY SQL sheets (for example GL-REPORTS, INS-REPORTS, APP-REPORTS).
- Each report data action must be callable via a runsql command that specifies the sheet name and column letter.
- Example macro3 content:
rem RUN ALL KEY REPORTS (TUTORIAL01)
runsql INS-REPORTS A
runsql INS-REPORTS B
runsql INS-REPORTS C
... repeat runsql ONLY for as many data actions (columns) as exist in the SQL sheet, no more. E.g., if there are 2 data actions in the SQL sheet, then there will be only 2 runsql lines in macro3.
- Include at least one macro3 sequence that runs the key end-to-end reports used in the TUTORIAL01 test cases, so that UAT testers can execute all major reports in one go.
- Also include a shorter 'smoke test' group in macro3 (commented) that runs only the minimal reports needed to confirm that PROD01 is working.
Linking macro2/macro3 to docs_readme and docs_testing:
- In docs_readme, explicitly reference macro2 and macro3 in the BAU procedures.
- In docs_testing, reference macro2 and macro3 in the end-to-end test cases and smoke tests.
- For example, for an installation scenario: docs_readme should instruct the user to run macro2 to create tables and load TUTORIAL01 sample data before running any reports.
- For a reporting or UAT scenario: docs_testing should reference macro3 and specify which runsql lines in macro3 are used to satisfy specific TestCase:TC-xxx and UserStory:US-xxx.
docs_readme sheet requirements:
- Create a sheet named docs_readme.
- Use plain text to describe the application at a high level.
- Document the overall purpose (for example D4E Insurance for SME brokers).
- List each SQL sheet, its process group, and what it does.
- List each data action, its sheet and column, and its business purpose where helpful.
- Document which macro(s) in macro1 must be run first (for example to create setup files).
- Document assumptions such as database type, server, and required ODBC settings (PostgreSQL by default).
- In docs_readme, focus on BAU procedures (daily, monthly, annual) and IT/ops information, not detailed test cases.
- Use only plain text; no formulas are required.
docs_testing sheet requirements:
- Create a sheet named docs_testing.
- Use plain text to describe user stories, acceptance criteria, and test cases / end-to-end scripts.
- For each key process, place the UserStory:US-xxx and its AcceptanceCriteria:US-xxx in docs_testing.
- Also place TestCase:TC-xxx definitions in docs_testing and link them to the related user stories and specific macros/data actions to run.
- docs_testing is the primary source for business-facing testing and UAT documentation, separate from the BAU and IT/ops content in docs_readme.
TUTORIAL AND PRODUCTION COMPANIES:
- Always design the application around a company_id (or equivalent) dimension where it makes sense.
- You must include at least two companies: TUTORIAL01 and PROD01.
- TUTORIAL01 is a fully-populated tutorial company with rich sample data across all relevant tables.
- PROD01 is a production-ready blank company with no transactional rows; it may have minimal reference/master data if required.
- All core SELECT data actions must allow filtering by company_id and expose company_id_key in the SELECT list.
- Sample data INSERT actions must load rows for TUTORIAL01 only, leaving PROD01 empty of transactions so users can start real work immediately.
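As a sketch, a sample-data INSERT that respects this rule might look like the following (gl_journal_work and the specific values are illustrative; only TUTORIAL01 receives rows):

```sql
INSERT INTO gl_journal_work
    (company_id, batch_id, line_seq, trans_date, natural_acct,
     debit_amount, credit_amount, currency, description)
VALUES
    ('TUTORIAL01', 'B0001', 1, '2025-01-15', '4000', 0, 1000.00, 'USD', 'Sample sales invoice'),
    ('TUTORIAL01', 'B0001', 2, '2025-01-15', '1200', 1000.00, 0, 'USD', 'Sample sales invoice');
-- No rows are inserted for PROD01, so it stays free of tutorial transactions.
```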
APPLICATION QUALITY & SCALABILITY STANDARDS:
- Do NOT design a basic or toy demo application.
- Always design the workbook so it can be used in REAL production by:
- Small and medium enterprises (SMEs), and
- Subsidiaries or business units of larger groups.
- The design must support ongoing BAU operations, not just a one-off example.
- Assume the user may have thousands or millions of rows over time, across multiple periods and multiple entities.
Entity and organisational structure:
- Always include a company_id (or equivalent) to support multiple companies in the same database.
- Where relevant, include dimensions such as branch_id, department_id, product_code, or cost_centre to reflect real organisational structures.
- Design so that new companies or branches can be added without changing table structures (only new rows in reference tables).
Data modelling and relational practices:
- Follow basic relational data modelling practices:
- Use stable primary keys (id fields) and key fields with the _key suffix in SELECTs.
- Separate master/reference tables (for example customers, products, chart of accounts) from transaction tables (for example journals, policies, claims, invoices).
- Include status fields (for example ACTIVE/CLOSED, OPEN/PAID, DRAFT/APPROVED) where appropriate.
- Include effective dates or validity dates where needed (for example policy start/end dates, rate validity periods).
- Avoid designs that collapse everything into a single flat table if that would break normal SME/enterprise patterns.
Auditability and control:
- Include basic audit fields where sensible, such as created_date, last_updated_date, created_by, last_updated_by (even if populated initially by system defaults).
- Prefer logical (soft) delete flags or status changes over hard deletes for business-critical data, unless the use case clearly requires physical deletion.
- Design data actions so that key business transactions can be traced from source to summary (for example from policy to claim, or from journal lines to balances).
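A master table that follows these audit and status conventions could be sketched as below (policy_master and its columns are illustrative, not a required schema; the status field supports soft delete via ACTIVE/CLOSED):

```sql
CREATE TABLE policy_master (
    company_id        VARCHAR(20) NOT NULL,
    policy_id         VARCHAR(20) NOT NULL,
    customer_id       VARCHAR(20),
    policy_status     VARCHAR(10) DEFAULT 'ACTIVE',  -- ACTIVE/CLOSED instead of hard delete
    start_date        DATE,
    end_date          DATE,
    created_date      TIMESTAMP   DEFAULT CURRENT_TIMESTAMP,
    created_by        VARCHAR(50) DEFAULT 'SYSTEM',
    last_updated_date TIMESTAMP   DEFAULT CURRENT_TIMESTAMP,
    last_updated_by   VARCHAR(50) DEFAULT 'SYSTEM',
    PRIMARY KEY (company_id, policy_id)
);
```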
Coverage of standard business processes:
- Ensure the application covers a minimum realistic scope for the chosen domain:
- For GL-style apps: master data for companies and accounts, journal capture, enquiry, and core reporting.
- For Insurance-style apps: policy master, claims, payments, and key management/underwriting views.
- For any domain: reference/master data, transaction capture, and management reporting / KPIs.
- Do not restrict the design to a single narrow report; always include the core data needed to support BAU operations and simple management reporting.
Industry-aligned patterns (high level):
- Align with common industry patterns at a high level without copying any specific vendor product:
- Use separate tables for master data vs. transactional data.
- Use clear, non-ambiguous field names that would make sense to accountants, finance analysts, or insurance operations teams.
- Support multi-period and multi-year reporting with date and period fields.
- The goal is that an SME or a subsidiary of a larger company could reasonably adopt this workbook as a front end on top of a real database, with minimal structural changes.
Non-triviality requirement:
- Do not produce a trivial example with only one or two tables and a single simple report, unless the userinput explicitly asks for a minimal training-only app.
- By default, design at least:
- Several core tables (for example master + transaction + link tables where appropriate),
- Multiple SQL sheets (for example CREATES, WORK, REPORTS),
- A meaningful set of data actions that support daily, monthly, and year-end procedures.
USER STORIES AND BUSINESS-FACING DOCUMENTATION (docs_testing):
- In docs_testing, you must document complete user stories for each key process in clear business language.
- Use the standard user story format for each process:
AS A <persona>,
I WANT <business goal/task>,
SO THAT <business outcome/value>.
- For each user story, also include explicit Acceptance Criteria that define when the story is considered done.
- Use a simple, readable pattern such as:
UserStory:US-001 - AS A Claims Handler, I WANT to see all open claims by policy, SO THAT I can prioritise follow-up on high-value or overdue claims.
AcceptanceCriteria:US-001
- When I select company TUTORIAL01 and status = OPEN, a list of all open claims is displayed.
- The list includes policy_id, claim_id, loss_date, claim_status, reserve_amount, and paid_amount.
- Totals by policy are visible or can be derived from the output.
- Include at least one user story (with acceptance criteria) for each major process area (for example policy maintenance, claims handling, period close, key reporting).
- Write user stories so they make sense to business users (accountants, analysts, underwriters, claims handlers) without needing to see the SQL.
TEST CASES AND END-TO-END SCRIPTS (docs_testing):
- In docs_testing, include explicit end-to-end test scripts so enterprises can accept and validate the application.
- For each core process, define at least one test case in a structured format, and link it to the relevant user story ID:
TestCase:TC-001
RelatedUserStory:US-001
Purpose: <what this scenario verifies>
Preconditions: <data or setup required, e.g. TUTORIAL01 sample data loaded>
Steps: <numbered list of steps to run macros and data actions>
ExpectedResult: <describe concrete outcomes that match the AcceptanceCriteria for US-001>
- Ensure that ExpectedResult items explicitly reflect the AcceptanceCriteria defined for the related user story (for example row counts, totals, sample values).
- Reference the exact data actions and macros to run (sheet name, column letter, macro1/macro2/macro3 command names) so testers can follow them precisely.
- Provide at least:
- One end-to-end test for TUTORIAL01 that covers a realistic business scenario (e.g. from transaction or policy entry through to a report or KPI), and
- One smoke-test script for PROD01 that confirms the environment is ready for production use (connections OK, tables exist, no tutorial data mixed in).
IT, OPERATIONS, AND GOVERNANCE DOCUMENTATION (docs_readme):
- In docs_readme, include a short section aimed at IT / Operations teams so they can manage and support the solution.
- At minimum, document:
- Environment & prerequisites: database type and version (PostgreSQL by default), ODBC requirements, expected location of setup files (e.g. c:\temp\d4e).
- Security & credentials: where credentials are stored (e.g. setupgl.txt / setupins.txt), how they are referenced (<<fullfilename@@row>>), and a note that passwords are not hard-coded in the workbook.
- Configuration parameters: where to change database name, server, port, and any key reference codes (for example default company_id, currency, product groups).
- Backup & restore considerations: which tables hold transactional data vs reference data, and a brief note that production data should be backed up at the database level.
- Logging & diagnostics: what D4E logs (for example LogExecution entries), and which data actions/macros to run when troubleshooting connectivity or data issues.
- Limits & assumptions: any known limitations (for example single currency vs multi-currency, simple tax handling, simplified claim states) so IT and business teams know how far the app can be pushed before customisation is needed.
- Change/version notes: a simple Version: line and a short ChangeLog section indicating the intended version name (for example D4E_INSURANCE_2025 v1.0) and what this initial version covers.
Professional design principle:
- Treat the workbook as if it were a small but real enterprise application front-end, not a throwaway example.
- Assume that an SME or a subsidiary IT team may need to onboard, test, and support this over time.
- Use the combination of:
- User stories with acceptance criteria (docs_testing),
- BAU procedure documentation (docs_readme),
- Installation and test scripts (macros + docs_testing),
- IT/ops notes (docs_readme),
- so that the application is understandable, testable, and maintainable by both business and IT, even if they have never seen D4E before.
INSTALLATION DATA ACTIONS:
- You must provide explicit installation data actions that drop/create databases, create tables and load sample data.
- Create at least one SQL sheet (for example APP-CREATES or GL-CREATES or INS-CREATES) that contains:
- Data actions with DROP and CREATE DATABASE (PostgreSQL syntax).
- Data actions with CREATE TABLE for all required tables.
- Data actions that INSERT sample data for the TUTORIAL01 company (and any required reference data).
- Name installation actions clearly, for example:
/* app.CREATE_ALL_TABLES */
/* app.LOAD_TUTORIAL_SAMPLE_DATA */
- Installation actions must be safe to re-run (idempotent where possible) or clearly documented as one-time only.
- In docs_readme, clearly list the order in which installation actions and macro1/macro2 commands must be run for a fresh install.
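For example, a re-runnable table-creation data action can pair DROP TABLE IF EXISTS with CREATE TABLE (standard PostgreSQL syntax; project_master is an illustrative table):

```sql
DROP TABLE IF EXISTS project_master;
CREATE TABLE project_master (
    company_id   VARCHAR(20) NOT NULL,
    project_id   VARCHAR(20) NOT NULL,
    project_name VARCHAR(100),
    PRIMARY KEY (company_id, project_id)
);
```

Re-running this action simply rebuilds the table, which makes a fresh install repeatable; note that it also discards any existing rows, so docs_readme should flag it as install-time only.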
BAU PROCEDURES AND TRAINING FLOWS IN docs_readme:
- In docs_readme, you must document real BAU (Business As Usual) procedures, not just technical notes.
- At minimum, describe:
- Daily procedures (for example: load new transactions, run key checks, refresh flash reports).
- Monthly procedures (for example: close period, reconcile balances, run key management reports).
- Annual or year-end procedures (for example: archive data, roll forward balances, run statutory summary views).
- For each BAU procedure, reference the exact SQL data actions and macro1/macro2/macro3 commands to run, in the sequence they should be executed.
- Provide at least one end-to-end TUTORIAL scenario using TUTORIAL01, so a new user can follow steps like:
1) Run installation actions, 2) Run tutorial data loads, 3) Run key reports, 4) Verify totals.
- Clearly distinguish tutorial procedures (TUTORIAL01) from real production usage (PROD01), so users do not confuse sample data with live data.
Additional safety rules (based on previous issues):
- Do not skip columns in SQL sheets; use A, B, C, D in sequence.
- Do not use underscores (_) or periods (.) in sheet names; use hyphens (-) instead.
- Always include a proper title in row 1 for every data action, wrapped in /* and */.
- The driver block must always end with two leading spaces before the terminator: */
- Do not add blank rows before or after the driver block.
- Do not insert extra blank rows in the SQL section that would break parsing.
- Avoid dangling commas in SQL lists.
- Use plain ASCII only; no smart quotes, emojis, or special symbols.
- Use /* and */ only for titles and driver blocks, not for wrapping the whole SQL.
- Prefer consistent naming for installation vs BAU data actions (for example prefix with app., gl., ins.).
The userinput parameter may describe which application to generate (for example an Insurance application for SME brokers).
You must design the workbook sheets, data actions, macro1, macro2, macro3, docs_readme, and docs_testing content to implement that application, and then emit the corresponding .xlsx workbook.
MANDATORY CHECKLIST (NON-NEGOTIABLE):
Your design is INVALID unless ALL of the following conditions are satisfied:
- You create the following sheets (exact names): macro1, macro2, macro3, docs_readme, docs_testing, <appname>-CREATES, <appname>-REPORTS, <appname>-WORK.
- You create at least THREE <appname> SQL sheets (each with SQL columns) whose names follow the D4E naming rules (no underscores or periods; not starting with menu/sheet/docs/macro) and which represent the CREATES, WORK, and REPORTS process areas described below:
In sheet <appname>-CREATES, you must have data action columns for:
- 1) DROP the database if existing; and CREATE the database if not existing yet (PostgreSQL syntax).
- 2) CREATE TABLE statements for each table needed.
- 3) TABLE DUMP data actions using SELECT: EXACTLY ONE table-dump data action for EVERY table you created in step 2).
TABLE DUMP REQUIREMENT (CRITICAL):
- If you define N CREATE TABLE statements, you MUST define at least N table-dump data actions.
- Each table-dump must be a separate Data Action column, with a clear title like /* app.DUMP_<tablename> */ and a SELECT that returns all key fields plus useful columns.
- For EVERY table you create (for example employee_master, timesheet, project_master), you MUST create a matching table-dump data action that:
- Uses SELECT ...
- Aliases the primary key with _key (for example employee_name AS employee_name_key).
- Starts WHERE with: WHERE 1 = 1
- Uses $in$ prompts for each filterable column, for example:
AND <<Choose employee_name$in$employee_name?>>
AND <<Choose dept$in$dept?>>
AND <<Choose created_date$in$created_date?>>
- Ends with a sensible ORDER BY for that table (for example ORDER BY employee_name;).
- Example table-dump pattern for employee_master:
SELECT
employee_name AS employee_name_key,
dept,
created_date
FROM employee_master
WHERE 1 = 1
AND <<Choose employee_name$in$employee_name?>>
AND <<Choose dept$in$dept?>>
AND <<Choose created_date$in$created_date?>>
ORDER BY employee_name;
You MUST have a table-dump data action for EVERY table created in step 2 (for example employee_master, timesheet, project_master).
BAU reports in <appname>-REPORTS never count towards the required table-dump count; the table-dump requirement applies ONLY to the SELECT actions you define in <appname>-CREATES.
In sheet <appname>-WORK, you must have at least five data action columns with UPDATE or INSERT statements, covering:
- 1) updates to TRANSACTIONS in a BAU process,
- 2) sample data inserts,
- 3) other data actions that process BAU data, e.g. timesheet approval, GL posting, payment matching.
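A BAU update data action for <appname>-WORK might look like this sketch (a hypothetical timesheet approval; the table and column names are illustrative, while the WHERE pattern follows the D4E $in$ convention above):

```sql
UPDATE timesheet
SET approval_status = 'APPROVED',
    last_updated_date = CURRENT_TIMESTAMP,
    last_updated_by = 'SYSTEM'
WHERE 1 = 1
  AND approval_status = 'SUBMITTED'
  AND <<Choose Company$in$company_id?>>
  AND <<Choose Employee$in$employee_name?>>;
```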
In sheet <appname>-REPORTS, you must have data action columns to provide BAU REPORTS or ENQUIRY for transactions or stage data. These are more specific logic than table dumps and are intended for day-to-day business use.
BAU report data actions DO NOT replace the mandatory table-dump data actions in <appname>-CREATES.
Even if a BAU report looks similar to a table dump, you must still create a separate table-dump data action for that table in <appname>-CREATES.
It is OK (and expected) to have redundant-looking outputs: table-dump actions are diagnostic/utility queries that end-users may customise or repurpose later, while BAU reports are governed, fit-for-purpose business views that should not be casually changed.
WORKBOOK STABILITY RULE (NON-NEGOTIABLE)
The generated XLSX must open in Microsoft 365 Excel without any repair prompt and must not contain #REF! in any formula cells.
If any formula is risky, replace it with a simpler equivalent or a placeholder value plus an audit check.
1) BANNED FORMULA PATTERNS (do not use anywhere)
Do NOT use these functions in any cell that will be inside an Excel Table (structured table columns), and avoid them entirely unless absolutely necessary:
- LET, LAMBDA
- FILTER, SORT, UNIQUE, TAKE, DROP, HSTACK, VSTACK, WRAPROWS, WRAPCOLS
- MAP, BYROW, BYCOL, SCAN, REDUCE
- TEXTSPLIT, TEXTAFTER, TEXTBEFORE
- Dynamic array spilling formulas in tables (anything that would spill)
If these are needed for dashboards, use them only in normal cells outside tables, but prefer to avoid them entirely.
2) APPROVED SAFE FORMULAS (preferred set)
Prefer these functions for all derived fields and controls:
- SUMIFS, COUNTIFS, AVERAGEIFS, MINIFS, MAXIFS
- XLOOKUP (or INDEX/MATCH if needed), XMATCH
- IF, IFS, AND, OR, IFERROR
- NETWORKDAYS.INTL, WORKDAY
- ROUND, ROUNDUP, ROUNDDOWN
These are considered stable and must be used wherever possible.
3) STRUCTURED REFERENCE SAFETY RULES
All tables must have explicit names (e.g., tblProjects, tblTasks, tblTimesheets).
Do not create formulas that reference deleted or renamed columns.
Do not reference columns that may not exist yet.
Avoid circular references by design (no column should depend on itself directly or indirectly).
4) NAMED RANGE RULES (avoid corruption)
Use only simple fixed-address named ranges, such as Config!$B$3 or Lists!$A$2:$A$10.
Do NOT create names that use OFFSET, INDIRECT, volatile dynamic ranges, or formulas that spill.
Keep named range names ASCII only; avoid special characters.
5) DASHBOARD RULE (no fragile arrays)
Dashboards must be computed using:
- SUMIFS / COUNTIFS rollups
- XLOOKUP selection for single-project views
Do not use FILTER, SORT, or UNIQUE to generate dynamic lists. If needed, show counts and totals, not generated lists.
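For instance, dashboard cells built only from the approved functions might look like the following (the tblTimesheets/tblProjects column names are assumptions based on the example table names above):

```text
=SUMIFS(tblTimesheets[hours], tblTimesheets[project_id], $A3)
=XLOOKUP($B$1, tblProjects[project_id], tblProjects[project_name], "Not found")
```

Both are scalar, non-spilling formulas, so they satisfy the stability rules even when placed inside Excel Tables.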
6) AUDIT_CHECKS RULE (no advanced array math)
Audit checks must be simple scalar metrics:
- COUNTIF, COUNTIFS, SUMIFS only
- Duplicate checks must be done with COUNTIF-based measures
No SUM(--(...)), no spilled arrays, no dynamic lambda.
7) VALIDATION GOAL
Before final output, the generator must ensure:
- The workbook opens without repair
- No formulas are removed by Excel
- No #REF! appears in any computed cell
If any part is uncertain, downgrade that element to a simpler calculation and add an audit warning explaining what was simplified.
- You must insert at least 20 sample transaction rows for TUTORIAL01, and at least 10 reference rows (table entries, parameters, or accounts) to be maintained.
- You define at least FIVE core tables using CREATE TABLE statements in the CREATES/SETUP sheet (columns), including at a minimum:
- At least one master/reference table (for example chart of accounts, policy master, customer master),
- At least one main transaction table (for example journal entries, claims, invoices),
- Optionally an additional reference or summary table (for example balances, segments, branches, or products).
- You support at least TWO logical companies in the design: TUTORIAL01 (sample data) and PROD01 (production, initially blank of transactions).
- In the sample/reference data actions, you load realistic sample data for TUTORIAL01 only, and ensure PROD01 is present but not polluted with tutorial transactions.
- You implement macro1 for setup, macro2 to run ALL installation data actions (create tables and load sample/reference data), and macro3 to run ALL key reports/enquiries.
- In docs_testing, you include at least THREE user stories (UserStory:US-xxx) with AcceptanceCriteria:US-xxx covering different processes (for example master data, transactions, reporting).
- In docs_testing, you include at least THREE test cases (TestCase:TC-xxx) linked to those user stories (RelatedUserStory:US-xxx) with concrete ExpectedResult lines that reflect the acceptance criteria.
- In docs_readme, you document BAU procedures (one each for daily, monthly, annual) and basic IT/ops information (environment, credentials pattern, configuration, logging, limitations).
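As a sketch only — the table and column names below are illustrative, not mandated — the core-table and company-separation requirements could be met with PostgreSQL statements like the following (in the workbook, each statement would occupy its own Data Action column):

```sql
-- Master/reference table, keyed by company so TUTORIAL01 and PROD01 coexist
CREATE TABLE chart_of_accounts (
    company_id   VARCHAR(20)  NOT NULL,
    account_id   VARCHAR(20)  NOT NULL,
    account_name VARCHAR(100),
    PRIMARY KEY (company_id, account_id)
);

-- Main transaction table
CREATE TABLE journal_lines (
    company_id  VARCHAR(20)   NOT NULL,
    batch_id    INTEGER       NOT NULL,
    line_seq    INTEGER       NOT NULL,
    trans_date  DATE,
    account_id  VARCHAR(20),
    amount      NUMERIC(18,2),
    description VARCHAR(200),
    PRIMARY KEY (company_id, batch_id, line_seq)
);

-- Reference rows exist for BOTH companies ...
INSERT INTO chart_of_accounts VALUES ('TUTORIAL01', '1000', 'Cash at Bank');
INSERT INTO chart_of_accounts VALUES ('PROD01',     '1000', 'Cash at Bank');

-- ... but transaction rows are loaded for TUTORIAL01 only, keeping PROD01 clean
INSERT INTO journal_lines VALUES ('TUTORIAL01', 1, 1, DATE '2025-01-15', '1000', 100.00, 'Sample receipt');
```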
If your design is invalid, THINK HARD and regenerate your design at least 3 times to try to reach a valid status.
If you cannot satisfy ALL of the above conditions, you MUST NOT pretend the design is complete.
In that case, you must clearly report ValidationStatus:INVALID in the *** messages *** block and explain which conditions are missing.
VALIDATION SUMMARY FORMAT IN *** messages ***:
At the beginning of the *** messages *** block, you must output a simple validation summary so that D4E can quickly check completeness.
Use the following pattern (one item per line):
ValidationStatus: OK or ValidationStatus: INVALID
SheetsPresent: <comma-separated list of sheet names>
InvalidSheetName: <non-SQL sheet names that do not begin with 'docs_'>
SqlSheetsPresent: <comma-separated list of SQL sheet names>
CompaniesDefined: TUTORIAL01,PROD01,<any others>
DatabaseDroppedDefined: <name of application database>
CoreTablesDefined: <comma-separated list of key table names>
UserStoriesCount: <number>
TableDumpsCount (equals TablesCreated): <number>
TestCasesCount: <number>
Macro1Defined: YES/NO
Macro2Defined: YES/NO
Macro3Defined: YES/NO
- After this short summary, you may continue with more detailed explanations, assumptions, and suggestions, still inside the *** messages *** block.
- ValidationStatus: OK must only be used if you have actually produced all mandatory sheets (and table dumps within sheets), tables, table dumps, stories, test cases, and macros described in this checklist.
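For reference, a completed summary — using hypothetical sheet, database, and table names — would look like:

```
ValidationStatus: OK
SheetsPresent: GL-CREATE,GL-LOAD,GL-REPORT,macro1,macro2,macro3,docs_readme,docs_testing,docs_context,docs_instructions
InvalidSheetName:
SqlSheetsPresent: GL-CREATE,GL-LOAD,GL-REPORT
CompaniesDefined: TUTORIAL01,PROD01
DatabaseDroppedDefined: d4e_gl
CoreTablesDefined: chart_of_accounts,journal_batches,journal_lines,periods,balances
UserStoriesCount: 3
TableDumpsCount (equals TablesCreated): 5
TestCasesCount: 3
Macro1Defined: YES
Macro2Defined: YES
Macro3Defined: YES
```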
Furthermore, add the sheets described by the blueprint below:
SHEET:docs_context
CELL:A1:DO_NOT_EDIT_THIS_SHEET
CELL:A2:This sheet defines the permanent, unchangeable rules for any D4E-generated application.
CELL:A3:It must NEVER be rewritten, regenerated, or improved by AI or humans.
CELL:A4:All improvements must occur ONLY in SQL sheets, macro sheets, docs_readme, and docs_testing.
CELL:A5:DO NOT DELETE, RENAME, OR MODIFY THIS SHEET.
CELL:A7:PURPOSE OF THIS WORKBOOK
CELL:A8:This workbook is a D4E application (Timesheet, GL, etc.) containing SQL Data Actions and macros.
CELL:A9:The logic of the system is driven by Data Actions in SQL sheets, not by Excel formulas.
CELL:A11:DATA ACTION CONSTRUCTION RULES
CELL:A12:- A Data Action is ONE full column on a SQL sheet (A–Z, max 24 columns).
CELL:A13:- SQL sheet names must avoid underscores and periods; use hyphens (GL-CREATE, TS-REPORT).
CELL:A14:- Row 1 must contain the Data Action title wrapped in /* and */.
CELL:A15:- Row 2 begins the Driver block, starting with /* Driver:
CELL:A16:- The Driver block must end with two leading spaces then */
CELL:A17:- The SQL statement must begin immediately after the Driver terminator, no blank lines.
CELL:A18:- A Data Action must contain exactly one SQL statement (one CREATE or one SELECT or one INSERT).
CELL:A20:DRIVER CONTENT RULES
CELL:A21:- Must define $conn and $table.
CELL:A22:- Connection values may come from external files via <<fullfilename@@row>>.
CELL:A23:Example (3-line setup.txt):
CELL:A24:$conn=
CELL:A25:<<c:\temp\d4e\setup.txt@@3>>;
CELL:A26:Uid=<<c:\temp\d4e\setup.txt@@1>>;
CELL:A27:Pwd=<<c:\temp\d4e\setup.txt@@2>>;
CELL:A28:;=$conn,
CELL:A29:$table=table_name,
CELL:A31:DEFAULT DATABASE DIALECT
CELL:A32:- Unless otherwise stated in the workbook or user instructions, all SQL and DDL must use PostgreSQL syntax.
CELL:A33:- Use other dialects (for example MySQL) ONLY if the workbook or user explicitly requires it.
CELL:A34:- The external setup file (for example c:\temp\d4e\setup.txt) must contain exactly 3 lines:
CELL:A35: Line 1: userid (for example postgres)
CELL:A36: Line 2: password
CELL:A37: Line 3: base ODBC connection string WITHOUT Database=, for example:
CELL:A38: Driver={PostgreSQL Unicode(x64)};Server=localhost;Port=5432;
CELL:A40:SQL RULES
CELL:A41:- No blank lines between Driver termination and SQL start.
CELL:A42:- SQL must be ASCII only (no emojis, smart quotes).
CELL:A43:- SELECT statements must alias key fields using _key suffix.
CELL:A44:Example: company_id AS company_id_key
CELL:A46:WHERE CLAUSE RULES
CELL:A47:Use D4E prompt substitution:
CELL:A48:WHERE 1 = 1
CELL:A49: AND <<Choose Company$in$company_id?>>
CELL:A50: AND <<Choose Period$in$period_no?>>
CELL:A51:For non-WHERE parameters use simple <<Enter Value?>> forms.
CELL:A53:EXAMPLE SQL PATTERN (REFERENCE ONLY)
CELL:A54:SELECT
CELL:A55: 'U' AS Action,
CELL:A56: company_id AS company_id_key,
CELL:A57: batch_id AS batch_id_key,
CELL:A58: line_seq AS line_seq_key,
CELL:A59: trans_date,
CELL:A60: description
CELL:A61:FROM work_table
CELL:A62:WHERE 1 = 1
CELL:A63: AND <<Choose Company$in$company_id?>>
CELL:A65:REQUIRED COMPANIES
CELL:A66:- All applications must support TUTORIAL01 and PROD01.
CELL:A67:- TUTORIAL01 contains full sample data for demos.
CELL:A68:- PROD01 contains empty or minimal real data.
CELL:A70:MACRO AND DOCS SHEETS
CELL:A71:- Macro sheets: macro1, macro2, macro3 (column A = commands).
CELL:A72:- docs_readme: Daily, Monthly, Yearly BAU procedures + FAQ.
CELL:A73:- docs_testing: User stories + Acceptance criteria + Test scripts.
CELL:A75:STABILITY RULES FOR AI
CELL:A76:- AI MUST NOT modify docs_context.
CELL:A77:- AI should improve SQL sheets, macro sheets, docs_readme, docs_testing.
CELL:A78:- AI must respect Data Action structure exactly.
CELL:A79:- AI should never delete or rename existing sheets.
CELL:A80:- AI must not modify docs_instructions; only human authors may update that sheet.
CELL:A81:- When in doubt, treat docs_context and docs_instructions as READ-ONLY system guidance.
CELL:A83:DATABASE CREATION RULES
CELL:A84:- Every D4E application must have its own database (for example d4e_timesheet, d4e_insurance, d4e_gl).
CELL:A85:- PostgreSQL is the default database engine for all D4E applications unless the user or workbook explicitly specifies another engine.
CELL:A86:- The VERY FIRST CREATE Data Action in the install flow MUST create this database if it does not already exist.
CELL:A87:- Example for a timesheet app (PostgreSQL):
CELL:A88: DROP DATABASE IF EXISTS d4e_timesheet;
CELL:A89: CREATE DATABASE d4e_timesheet ENCODING 'UTF8';
CELL:A90:- The database-creation Data Action must use a connection that works even when the app database does NOT yet exist. In PostgreSQL this means connecting to the admin database 'postgres' via the Driver block:
CELL:A91: $conn=
CELL:A92: <<c:\temp\d4e\setup.txt@@3>>;
CELL:A93: Database=postgres;
CELL:A94: Uid=<<c:\temp\d4e\setup.txt@@1>>;
CELL:A95: Pwd=<<c:\temp\d4e\setup.txt@@2>>;
CELL:A96: ;=$conn,
CELL:A97: $table=,
CELL:A98:- AFTER the database has been created, all subsequent Data Actions for this app must set Database=<app_database_name> (for example Database=d4e_timesheet;) in their Driver blocks while still using the same 3-line setup file.
CELL:A99:- Before creating the database, you must first DROP it if it exists (as shown above).
CELL:A100:- Install macros (for example macro2) must call the database-creation Data Action BEFORE any CREATE TABLE or INSERT Data Actions are executed.
SHEET:docs_instructions
CELL:A1:DO_NOT_EDIT_THIS_SHEET_AUTOMATICALLY
CELL:A2:This sheet is system guidance for AI and users.
CELL:A3:AI tools must NOT modify, delete, or rename docs_instructions. Only human authors may change this text.
CELL:A4:Follow docs_context for permanent rules and treat this sheet as a stable prompt template.
CELL:A6:INSTRUCTION SET FOR CHATGPT
CELL:A7:You must follow ONLY the instructions written in this sheet and in docs_context.
CELL:A8:You must IGNORE all prior conversations, external knowledge, assumptions, or training data.
CELL:A9:The sheet docs_context defines permanent D4E rules. You MUST follow those rules exactly.
CELL:A10:You MUST NOT modify docs_context or docs_instructions.
CELL:A11:Assume that NO OTHER INFORMATION EXISTS except what is written inside this workbook.
CELL:A12:Your job is to analyse the workbook and improve it based ONLY on docs_context and docs_instructions.
CELL:A13:There are two modes of operation:
CELL:A14:- AUTO-IMPROVE MODE (default): Analyse the workbook, list weaknesses and planned improvements, THEN immediately apply those improvements and emit the revised blueprint / workbook in the SAME response.
CELL:A15:- INTERACTIVE REVIEW MODE (only if the user explicitly asks you to stop after analysis or to wait for confirmation): In this case, output ONLY the analysis section (weaknesses, planned improvements, Data Action summary, assumptions) and STOP. Wait for a further prompt before generating any revised blueprint / workbook.
CELL:A16:Unless the user clearly asks you to stop after listing weaknesses or to wait for confirmation, you MUST assume AUTO-IMPROVE MODE.
CELL:A17:In all modes, your analysis section MUST include:
CELL:A18:- A list of weaknesses in the workbook
CELL:A19:- Planned improvements
CELL:A20:- Any added or corrected Data Actions
CELL:A21:- Any assumptions about business requirements
CELL:A22:After the analysis section, behaviour depends on the mode: in AUTO-IMPROVE MODE continue immediately with the improved design; in INTERACTIVE REVIEW MODE, stop after the analysis section.
CELL:A23:When analysing weaknesses and planning improvements, you MUST explicitly cross-check docs_context, docs_readme, docs_testing, all SQL sheets, and macro sheets (macro1, macro2, macro3) to identify what is missing or incomplete.
CELL:A24:Use docs_readme as your checklist for BAU coverage: confirm Daily, Monthly, and Annual procedures, FAQs, and examples exist and align with the standards in docs_context.
CELL:A25:Use docs_testing as your checklist for quality coverage: confirm user stories, acceptance criteria, and test cases cover both TUTORIAL01 and PROD01, and include edge and error scenarios where appropriate.
CELL:A26:Use the macro sheets and SQL sheets together to ensure install flows, work/BAU flows, and reports are consistent with docs_context (Data Action patterns, environment separation, archive/summary actions, and key alias rules).
CELL:A27:When improving, you may modify SQL sheets, macro sheets, docs_readme, docs_testing, but you MUST NOT delete, rename, or alter docs_context or docs_instructions. Your entire behaviour is governed ONLY by the rules in docs_context + docs_instructions. Do NOT use any other knowledge.
CELL:A28:I have uploaded a D4E-produced application in XLSX format.
CELL:A29:Please analyse the workbook and improve the application according to these rules:
CELL:A31:1. Read the sheet docs_context FIRST
CELL:A32:This sheet defines the permanent, non-editable standards for ALL D4E applications, including:
CELL:A33:- Data Action structure (title -> driver -> SQL)
CELL:A34:- Max 24 Data Actions per sheet (A-Z)
CELL:A35:- TUTORIAL01 + PROD01 requirement
CELL:A36:- Macros: macro1, macro2, macro3
CELL:A37:- Docs sheets: docs_readme, docs_testing
CELL:A38:- Rules for SELECT, CREATE, INSERT
CELL:A39:- Key aliasing (_key)
CELL:A40:- Prompt patterns ($in$, etc.)
CELL:A41:- Stability rules (never modify or delete docs_context)
CELL:A42:You must follow docs_context exactly.
CELL:A44:2. What you ARE allowed to improve
CELL:A45:You may revise and enhance:
CELL:A46:- SQL Data Actions
CELL:A47:- SQL structure and correctness
CELL:A48:- Macro sheets
CELL:A49:- docs_readme
CELL:A50:- docs_testing
CELL:A51:- Add missing test scripts, BAU procedures, FAQ, examples
CELL:A52:- Add missing SQL actions, validations, sample data
CELL:A53:- Improve naming, logic, robustness
CELL:A54:- Add end-to-end process flows
CELL:A55:- Add consistency checks
CELL:A56:- Expand coverage for TUTORIAL01 and PROD01
CELL:A58:3. What you MUST NOT change
CELL:A59:You MUST NOT delete or clear any existing non-blank Data Action columns (A–Z) on any SQL sheet.
CELL:A60:You MAY correct mistakes inside an existing Data Action column (title, Driver block, SQL statement) but you must keep that Data Action's original intent (for example: CREATE vs INSERT vs REPORT) and keep the column in the same position.
CELL:A61:When you need to add new Data Actions, you MUST append them only to currently empty columns to the RIGHT of all populated columns on that sheet. Do not shift, re-order, or overwrite existing columns.
CELL:A62:If you believe an existing Data Action should no longer be used, do NOT delete it. Instead, keep the column and change the row-1 title comment to include the word DEPRECATED (for example: /* app.REPORT_XYZ_DEPRECATED */).
CELL:A63:You MUST NOT delete, rename, or alter docs_context or docs_instructions, and you MUST NOT remove any existing SQL sheets, docs_readme, docs_testing, or macro sheets. You MUST NOT recreate SQL sheets from scratch; always start from the existing workbook and apply minimal, targeted changes.
CELL:A65:4. Output Format
CELL:A66:Please produce a new improved blueprint or XLSX that:
CELL:A67:- keeps all sheets in place
CELL:A68:- keeps docs_context identical
CELL:A69:- keeps docs_instructions identical
CELL:A70:- updates SQL and macros based on the rules
CELL:A71:- improves completeness and professional quality
CELL:A72:- follows Data Action standards exactly, without deleting or re-using existing Data Action columns.
CELL:A73:- includes additional Data Actions ONLY in new, previously empty columns to the right of existing ones.
CELL:A75:5. Required analysis section in your response
CELL:A76:Always include at the top of your response:
CELL:A77:- Weaknesses you found in the workbook
CELL:A78:- Improvements you plan to make
CELL:A79:- Summary of Data Actions being added, corrected, or reorganized
CELL:A80:- Assumptions about business requirements
CELL:A81:In AUTO-IMPROVE MODE (default), follow this analysis immediately with the improved blueprint / XLSX. In INTERACTIVE REVIEW MODE, output only this analysis section and stop.
SHEET:docs_aiprompt
CELL:A1:AI APP FACTORY - PROMPT + CONTEXT PATTERN
CELL:A2:This sheet explains the standard process for generating and improving D4E applications using ChatGPT.
CELL:A4:STEP 1 - CREATE THE APPLICATION PROMPT
CELL:A5:Ask ChatGPT:
CELL:A6:Write a detailed prompt that I can give to ChatGPT to create an Excel-based application, suitable for enterprise use, that:
CELL:A7:(EXAMPLE ONLY) emulates an IFRS 17 calculator (PAA vs GMM), suitable for enterprise use.
CELL:A8:This produces a plain-English application prompt (no D4E or technical rules required).
CELL:A10:STEP 2 - GET THE D4E CONTEXT6
CELL:A11:In D4E (inside Excel), submit:
CELL:A12:showcontext6
CELL:A13:Copy the full Context6 output from menulog.
CELL:A14:This provides the D4E application-generation rules.
CELL:A16:STEP 3 - BUILD THE APPLICATION
CELL:A17:Open a new ChatGPT session and enter:
CELL:A18:Create a downloadable Excel .XLSX file (DO NOT EMIT A BLUEPRINT), e.g., IFRS17 Calculator.xlsx based on the input prompt and context below.
CELL:A19:Then paste, in this order:
CELL:A20:1. The application prompt from Step 1
CELL:A21:2. The Context6 output from Step 2
CELL:A22:ChatGPT will generate the complete XLSX application that follows the D4E rules.
CELL:A24:STEP 4 - CONTINUALLY TEST, MODIFY, AND FINALISE THE APP
CELL:A25:Upload the latest generated Excel application back to ChatGPT and ask:
CELL:A26:(EXAMPLE ONLY) Review and improve this application so that all data access uses Oracle General Ledger (GL) tables and views. Maintain the existing application structure, logic, and documentation, and adapt the data actions to align with Oracle GL schemas and conventions.
CELL:A27:This step is used to:
CELL:A28:- refine the AI-generated app
CELL:A29:- align it with a real enterprise system (e.g. Oracle GL)
CELL:A30:- finalise it for production or validation use
Caratrel Consultants Pty Ltd ©2025 All Rights Reserved. (All names and trademarks are the property of their respective owners.)