chore: merge from dev #1017
Conversation
… resolution scopes (#992)
Walkthrough

The recent updates across various packages bring enhancements and fixes to schema validation, REPL initialization, plugin mechanisms, and command execution. Notably, there's improved support for complex attributes in IDEs, refined validation logic in schema validators, and a more efficient way to execute Prisma commands. These changes aim to streamline development workflows, enhance code reliability, and offer better integration capabilities with external services.
Review Status
Actionable comments generated: 6
Configuration used: CodeRabbit UI
Files selected for processing (14)
- packages/ide/jetbrains/CHANGELOG.md (1 hunks)
- packages/plugins/tanstack-query/src/runtime/vue.ts (2 hunks)
- packages/schema/src/cli/actions/repl.ts (2 hunks)
- packages/schema/src/language-server/validator/datamodel-validator.ts (2 hunks)
- packages/schema/src/language-server/validator/schema-validator.ts (1 hunks)
- packages/schema/src/plugins/prisma/schema-generator.ts (3 hunks)
- packages/schema/src/res/stdlib.zmodel (2 hunks)
- packages/schema/src/utils/exec-utils.ts (1 hunks)
- packages/schema/tests/schema/stdlib.test.ts (1 hunks)
- packages/schema/tests/schema/validation/attribute-validation.test.ts (2 hunks)
- packages/schema/tests/schema/validation/datamodel-validation.test.ts (17 hunks)
- packages/schema/tests/schema/validation/datasource-validation.test.ts (2 hunks)
- packages/schema/tests/schema/validation/schema-validation.test.ts (1 hunks)
- packages/schema/tests/utils.ts (4 hunks)
Additional comments: 14
packages/ide/jetbrains/CHANGELOG.md (1)
- 4-9: The changelog updates clearly outline the enhancements and fixes made to the JetBrains IDE support, including the complex usage of the `@@index` attribute, ZModel validation issue fixes related to model inheritance, and the introduction of auto-completion within attributes. These changes align well with the PR objectives of improving IDE support and enhancing the developer experience.
packages/schema/tests/schema/stdlib.test.ts (1)
- 27-27: The modification to throw a `SchemaLoadingError` with `validationErrors` directly in the standard library tests is a good practice. It simplifies error handling by leveraging the structured error information, making the test code more readable and maintainable.
packages/schema/tests/schema/validation/schema-validation.test.ts (1)
- 42-54: The addition of the test case to check for errors when importing a non-existing model file with an extension is a valuable enhancement to the schema validation tests. It ensures that import errors are correctly identified and reported, improving the robustness of the schema validation process.

packages/schema/tests/schema/validation/datasource-validation.test.ts (1)
- 5-18: The refactoring of the datasource validation tests to use the `safelyLoadModel` function for error handling is a positive change. It centralizes error handling logic, making the test code more readable and maintainable. The updated expectations in the test cases align well with this new structure, ensuring the tests' effectiveness.
packages/schema/tests/utils.ts (1)
- 77-95: > 📝 NOTE
  This review was outside the diff hunks and was mapped to the diff hunk with the greatest overlap. Original lines [10-95]

  The enhancements made to the `utils.ts` file, including the introduction of the `Errorish` type, modifications to the `SchemaLoadingError` class, and the addition of the `safelyLoadModel` function, significantly improve the testing utilities. These changes offer more flexible and clear error handling, as well as structured approaches to loading models and handling errors in tests. The `errorLike` constant function further aids in creating error-like objects for testing purposes, enhancing the robustness and readability of the testing utilities.
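  For readers without the diff at hand, here is a minimal sketch of how such utilities could fit together. Everything beyond the names cited above is an assumption; the real implementation in `packages/schema/tests/utils.ts` may differ.

  ```ts
  // Hypothetical sketch only — illustrates the pattern, not the actual project code.
  type Errorish = { message: string } & Record<string, unknown>;

  class SchemaLoadingError extends Error {
      constructor(public readonly errors: Errorish[]) {
          super(`Schema loading failed:\n${errors.map((e) => e.message).join('\n')}`);
      }
  }

  // Assumed to exist elsewhere in the test utilities.
  declare function loadModel(schema: string): Promise<unknown>;

  // Loads a model and returns either the parsed model or the structured errors,
  // so tests can assert on validation failures without try/catch noise.
  async function safelyLoadModel(schema: string) {
      try {
          return { kind: 'loaded' as const, model: await loadModel(schema) };
      } catch (err) {
          if (err instanceof SchemaLoadingError) {
              return { kind: 'error' as const, errors: err.errors };
          }
          throw err;
      }
  }

  // Creates a partial error object for use in assertions.
  const errorLike = (message: string): Errorish => ({ message });
  ```

  A test could then assert something like `expect(result).toMatchObject({ kind: 'error', errors: [errorLike('must be unique')] })`, which is the usage pattern the reviewed tests appear to follow.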
packages/plugins/tanstack-query/src/runtime/vue.ts (1)
- 64-64: The updates to the `useModelQuery` and `useInfiniteModelQuery` functions, excluding the `queryKey` property from the `options` parameter using `Omit`, are well-considered improvements. These changes streamline the API and enhance type safety by ensuring that the `queryKey` is managed internally, preventing potential conflicts or misuse. This approach aligns with best practices for API design and usability.

  Also applies to: 90-90
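  In type terms, the change amounts to roughly the following. This is a simplified, hypothetical signature — the real hooks are generic over the TanStack Query option types, which are richer than the stand-in below.

  ```ts
  // Stand-in for the TanStack Query options type; the real one is richer.
  interface ModelQueryOptions<TData> {
      queryKey: unknown[];
      enabled?: boolean;
      staleTime?: number;
      select?: (data: unknown) => TData;
  }

  // queryKey is no longer accepted from callers; it is derived internally from
  // the model, operation, and args so cache keys stay consistent.
  function useModelQuery<TData>(
      model: string,
      operation: string,
      args?: unknown,
      options?: Omit<ModelQueryOptions<TData>, 'queryKey'>
  ) {
      const queryKey = [model, operation, args]; // managed internally
      // ...delegate to the underlying query hook with { ...options, queryKey }
      return { queryKey, options };
  }
  ```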
packages/schema/src/language-server/validator/datamodel-validator.ts (1)
- 65-71: The updated validation logic for the `@id` attribute now includes checks for array types and ensures that the field is of a scalar or enum type. This is a significant improvement in ensuring data integrity and consistency. However, it's important to ensure that the `isEnum` function correctly handles all potential enum cases, including imported enums and locally defined enums within the same schema.
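  Sketched against a hypothetical AST shape (not the actual Langium types used by the validator), the described checks amount to something like:

  ```ts
  // Hypothetical field representation for illustration only.
  interface FieldInfo {
      name: string;
      isArray: boolean;
      builtinType?: string; // e.g. 'String', 'Int'
      referencedKind?: 'Enum' | 'Model';
  }

  const SCALAR_TYPES = new Set(['String', 'Boolean', 'Int', 'BigInt', 'Float', 'Decimal', 'DateTime', 'Bytes']);

  const isEnum = (field: FieldInfo) => field.referencedKind === 'Enum';
  const isScalar = (field: FieldInfo) => field.builtinType !== undefined && SCALAR_TYPES.has(field.builtinType);

  // Reject @id on list fields and on fields that are neither scalar nor enum typed.
  function validateIdField(field: FieldInfo, report: (message: string) => void) {
      if (field.isArray) {
          report(`Field "${field.name}" cannot be a list when used as @id`);
      }
      if (!isScalar(field) && !isEnum(field)) {
          report(`Field "${field.name}" must have a scalar or enum type to be used as @id`);
      }
  }
  ```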
packages/schema/tests/schema/validation/datamodel-validation.test.ts (3)
- 1-1: The introduction of `safelyLoadModel` and `errorLike` functions is a positive change, enhancing the readability and maintainability of the test cases by providing a more descriptive and consistent way to handle and assert errors.
- 12-21: This test case for duplicated fields effectively demonstrates the use of `safelyLoadModel` and `errorLike` to assert the presence of a specific error message. It's a good practice to ensure that the schema validation correctly identifies and reports duplicated field names.
- 125-331: The comprehensive addition of test cases for the `@id` field validation covers a wide range of scenarios, including checks for unique criteria, scalar and enum types, optional fields, list fields, and JSON fields. These tests significantly improve the coverage and robustness of the data model validation process. It's important to ensure that all edge cases are considered, especially for complex scenarios involving relations and abstract models.
packages/schema/src/res/stdlib.zmodel (2)
- 393-393: The addition of the `map: String?` parameter to the `@@fulltext` attribute enhances flexibility by allowing users to specify a custom name for the fulltext index in the database. This change aligns with the existing pattern of providing a `map` parameter for other attributes like `@id`, `@unique`, and `@@id`. It's a positive enhancement for users who need to adhere to specific naming conventions in their database schemas.
- 482-482: The addition of the `_ x: Int?` parameter to the `@db.VarBinary` attribute introduces the ability to specify the length of the VarBinary field in the database. This change is consistent with other database type modifiers in the schema, such as `@db.String`, `@db.Char`, and `@db.VarChar`, which also accept a length parameter. It's a valuable addition for users who need to define binary fields with specific lengths in their database schemas.
packages/schema/src/plugins/prisma/schema-generator.ts (2)
- 145-145: The use of `execPackage` for executing the `prisma format` command is a good improvement over `execSync`, enhancing flexibility and efficiency. Ensure that the `outFile` variable is properly sanitized or validated elsewhere to prevent command injection vulnerabilities.
- 154-160: The construction of the `generateCmd` string for the `prisma generate` command is well-handled, especially with the inclusion of `options.generateArgs`. However, it's crucial to ensure that `outFile` and `options.generateArgs` are properly sanitized or validated to prevent command injection vulnerabilities.
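  Put together, the flow being reviewed might look roughly like the sketch below. `execPackage`, `outFile`, and `generateArgs` are the names used in the review; everything else is an assumption, and the actual generator code is not reproduced here.

  ```ts
  declare function execPackage(cmd: string): void; // see the utility reviewed further below

  function formatAndGenerate(outFile: string, generateArgs?: string): void {
      // format the emitted Prisma schema
      execPackage(`prisma format --schema "${outFile}"`);

      // build the generate command, forwarding any user-supplied extra arguments;
      // note the review's caution: outFile/generateArgs should be validated upstream
      const generateCmd = `prisma generate --schema "${outFile}"${generateArgs ? ` ${generateArgs}` : ''}`;
      execPackage(generateCmd);
  }
  ```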
```ts
await loadModel(`
    ${ prelude }
    model A {
        id String @id
        x String
        y String
        z String
        @@fulltext([x, y, z])
    }

    model B {
        id String @id
        x String
        y String
        z String
        @@fulltext([x, y, z], map: "n")
    }
`);
```
The addition of models `A` and `B` with `@@fulltext` directives is a significant enhancement. It's important to ensure that the fields specified in the `@@fulltext` directive (`x`, `y`, `z`) are of a type that supports full-text search. Given that they are defined as `String`, this aligns with expectations for full-text search capabilities. However, it would be beneficial to include tests that specifically validate the behavior and functionality of these `@@fulltext` directives to ensure they work as expected in various scenarios.

Would you like me to help generate specific test cases for validating the `@@fulltext` directive functionality?
```diff
@@ -352,6 +371,7 @@ describe('Attribute tests', () => {
     _longBlob Bytes @db.LongBlob
     _binary Bytes @db.Binary
     _varBinary Bytes @db.VarBinary
+    _varBinarySized Bytes @db.VarBinary(100)
```
The introduction of the `_varBinarySized` field with a specified size (`@db.VarBinary(100)`) is a good practice for fields where the size can be anticipated, as it can help optimize storage and performance. However, it's crucial to ensure that the specified size aligns with the expected usage patterns and data sizes for this field. If this field is expected to store data that could exceed 100 bytes, consider adjusting the size accordingly or documenting the rationale behind the chosen size limit.

Consider validating or documenting the choice of size for the `_varBinarySized` field to ensure it aligns with anticipated data storage needs.
```diff
 } catch {
     await this.trackPrismaSchemaError(outFile);
     try {
         // run 'prisma generate' again with output to the console
-        await execSync(generateCmd);
+        await execPackage(generateCmd);
```
Attempting to run `prisma generate` again with output to the console after a failure is a good fallback strategy. However, consider logging the error or providing more detailed feedback to the user or developer about what went wrong during the first attempt.
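A possible shape for that suggestion, assuming `execPackage` accepts the usual `execSync`-style `stdio` option (a sketch, not the generator's actual code):

```ts
declare function execPackage(cmd: string, options?: { stdio?: 'ignore' | 'inherit' }): void;

function generateWithRetry(generateCmd: string): void {
    try {
        // quiet first attempt
        execPackage(generateCmd, { stdio: 'ignore' });
    } catch (firstError) {
        // surface why the quiet run failed before retrying with console output
        console.warn(`'prisma generate' failed, retrying with output:`, firstError);
        execPackage(generateCmd, { stdio: 'inherit' });
    }
}
```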
```ts
/**
 * Utility for running package commands through npx/bunx
 */
export function execPackage(cmd: string, options?: Omit<ExecSyncOptions, 'env'> & { env?: Record<string, string> }): void {
    const packageManager = process?.versions?.bun ? 'bunx' : 'npx';
    execSync(`${packageManager} ${cmd}`, options)
}
```
The addition of the `execPackage` utility function is a valuable enhancement, providing a flexible way to execute package commands with either `npx` or `bunx` based on the environment. This approach aligns with modern development practices and supports the use of alternative package managers like `bun`. Consider adding a brief comment explaining the rationale behind choosing between `npx` and `bunx` for future maintainers.

```diff
+    // Choose 'bunx' if 'bun' is present in the environment, otherwise default to 'npx'
     const packageManager = process?.versions?.bun ? 'bunx' : 'npx';
```
Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.
```suggestion
/**
 * Utility for running package commands through npx/bunx
 */
export function execPackage(cmd: string, options?: Omit<ExecSyncOptions, 'env'> & { env?: Record<string, string> }): void {
    // Choose 'bunx' if 'bun' is present in the environment, otherwise default to 'npx'
    const packageManager = process?.versions?.bun ? 'bunx' : 'npx';
    execSync(`${packageManager} ${cmd}`, options)
}
```
```diff
+        const importPath = imp.path.endsWith('.zmodel') ? imp.path : `${imp.path}.zmodel`;
         if (!importedModel) {
-            accept('error', `Cannot find model file ${imp.path}.zmodel`, { node: imp });
+            accept('error', `Cannot find model file ${importPath}`, { node: imp });
```
The update to the `validateImports` method in the `SchemaValidator` class, ensuring the import path is correctly formatted with the `.zmodel` extension, is a practical improvement. This change enhances the reliability of import validation by standardizing the import path format. Consider adding a comment explaining the rationale behind conditionally appending the `.zmodel` extension for future maintainers.

```diff
+    // Ensure the import path ends with '.zmodel' for consistent path resolution
     const importPath = imp.path.endsWith('.zmodel') ? imp.path : `${imp.path}.zmodel`;
```
Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.
```suggestion
// Ensure the import path ends with '.zmodel' for consistent path resolution
const importPath = imp.path.endsWith('.zmodel') ? imp.path : `${imp.path}.zmodel`;
if (!importedModel) {
    accept('error', `Cannot find model file ${importPath}`, { node: imp });
```
```ts
if (!process?.stdout?.isTTY && process?.versions?.bun) {
    console.error('REPL on Bun is only available in a TTY terminal at this time. Please use npm/npx to run the command in this context instead of bun/bunx.');
    return;
}
```
The updates to the `repl.ts` file, including the environment check before initializing the REPL session and moving the `prettyRepl` import inside the `repl` function, are thoughtful improvements. These changes enhance the CLI's usability and performance by ensuring compatibility and optimizing the import strategy. Consider adding comments explaining the environment check's purpose and the rationale behind moving the `prettyRepl` import for future maintainers.

```diff
+    // Check for a specific environment to ensure REPL compatibility
     if (!process?.stdout?.isTTY && process?.versions?.bun) {
```

```diff
+    // Import 'prettyRepl' inside the function to optimize performance
     const prettyRepl = await import('pretty-repl')
```

Also applies to: 18-18
Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.
```suggestion
// Check for a specific environment to ensure REPL compatibility
if (!process?.stdout?.isTTY && process?.versions?.bun) {
    console.error('REPL on Bun is only available in a TTY terminal at this time. Please use npm/npx to run the command in this context instead of bun/bunx.');
    return;
}
// Import 'prettyRepl' inside the function to optimize performance
const prettyRepl = await import('pretty-repl')
```
Review Status
Actionable comments generated: 1
Configuration used: CodeRabbit UI
Files ignored due to path filters (5)
- package.json is excluded by: `!**/*.json`
- packages/testtools/package.json is excluded by: `!**/*.json`
- packages/testtools/src/package.template.json is excluded by: `!**/*.json`
- pnpm-lock.yaml is excluded by: `!**/*.yaml`
- tests/integration/package.json is excluded by: `!**/*.json`
Files selected for processing (11)
- .github/workflows/build-test.yml (1 hunks)
- .gitignore (1 hunks)
- CONTRIBUTING.md (1 hunks)
- jest.config.ts (1 hunks)
- packages/testtools/src/schema.ts (5 hunks)
- script/test-scaffold.ts (1 hunks)
- test-setup.ts (1 hunks)
- tests/integration/jest.config.ts (1 hunks)
- tests/integration/tests/cli/generate.test.ts (2 hunks)
- tests/integration/tests/cli/plugins.test.ts (2 hunks)
- tests/integration/tests/schema/todo.zmodel (1 hunks)
Files skipped from review due to trivial changes (2)
- .github/workflows/build-test.yml
- .gitignore
Additional comments: 13
tests/integration/jest.config.ts (1)
- 1-8: The changes to `jest.config.ts` effectively leverage the `baseConfig` to extend the Jest configuration and correctly specify a setup file with `setupFilesAfterEnv`. This approach maintains configuration consistency and ensures any necessary setup is performed after the environment is initialized.
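  The described pattern could look roughly like this (file paths and the exact config shape are assumptions for illustration):

  ```ts
  // tests/integration/jest.config.ts — sketch of extending a shared base config.
  import type { Config } from 'jest';
  import baseConfig from '../../jest.config';

  const config: Config = {
      ...baseConfig,
      // run shared setup after the test framework has been installed in the environment
      setupFilesAfterEnv: ['<rootDir>/../../test-setup.ts'],
  };

  export default config;
  ```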
test-setup.ts (1)
- 1-9: The setup function in `test-setup.ts` correctly checks for the necessary precondition and provides clear instructions on how to proceed if the precondition is not met. This ensures that tests run in a correctly initialized environment.
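  A minimal sketch of such a check (the scaffold location and the message wording are assumptions):

  ```ts
  // test-setup.ts — fail fast with actionable instructions if the scaffold is missing.
  import fs from 'fs';
  import path from 'path';

  const scaffoldDir = path.join(__dirname, '.test', 'scaffold'); // assumed location

  if (!fs.existsSync(scaffoldDir)) {
      throw new Error(
          'Test scaffold not found. Please run the scaffold script (script/test-scaffold.ts) before running the test suites.'
      );
  }
  ```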
script/test-scaffold.ts (1)
- 1-24: The `test-scaffold.ts` script is well-organized and effectively sets up a clean test environment by managing directories and installing necessary dependencies. The encapsulation of command execution in the `run` function enhances maintainability and reusability.
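  The overall shape of such a script might be as follows — the directory layout and dependency list here are assumptions, not the script's actual contents:

  ```ts
  // script/test-scaffold.ts — sketch of preparing a clean scaffold directory.
  import fs from 'fs';
  import path from 'path';
  import { execSync } from 'child_process';

  function run(cmd: string, cwd: string) {
      console.log(`Running "${cmd}" in ${cwd}`);
      execSync(cmd, { cwd, stdio: 'inherit' });
  }

  const scaffoldDir = path.join(__dirname, '..', '.test', 'scaffold');

  // start from a clean directory on every run
  fs.rmSync(scaffoldDir, { recursive: true, force: true });
  fs.mkdirSync(scaffoldDir, { recursive: true });

  // install the dependencies shared by the generated test projects
  run('npm init -y', scaffoldDir);
  run('npm install --no-audit --no-fund typescript prisma @prisma/client', scaffoldDir);
  ```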
jest.config.ts (1)
- 6-18: The addition of the `path` module and its use in configuring `globalSetup` and `coverageDirectory` in `jest.config.ts` correctly enhances cross-platform compatibility and ensures a global setup is executed before test suites. This is a well-implemented enhancement to the testing configuration.
CONTRIBUTING.md (1)
- 38-45: The addition of a scaffolding step in the `CONTRIBUTING.md` file is a valuable update, providing clear instructions to contributors on setting up the project for testing. This enhances the contribution guide by making the initial setup process more transparent and straightforward.
tests/integration/tests/schema/todo.zmodel (1)
- 16-16: Commenting out `preserveTsFiles = true` in the `todo.zmodel` file changes the default behavior of the Zod plugin regarding TypeScript file preservation. Contributors should be aware of how this might affect their development workflow, especially in terms of type safety and code generation.
tests/integration/tests/cli/generate.test.ts (1)
- 46-47: Refactoring package installation in `generate.test.ts` to use the `installPackage` function simplifies the setup process and enhances code readability and maintainability. This is a positive change that improves the quality of the test setup code.
tests/integration/tests/cli/plugins.test.ts (1)
- 97-98: Updating the package installation logic in `cli/plugins.test.ts` to use `installPackage` for both dependencies and dev dependencies standardizes and clarifies the setup process. This change enhances the maintainability and accuracy of the test environment setup.
packages/testtools/src/schema.ts (5)
- 38-50: The error handling in the `run` function has been improved by catching exceptions and logging them before re-throwing. This approach ensures that errors during command execution are not silently ignored, providing better visibility into failures.
- 53-54: The introduction of the `installPackage` function enhances modularity by encapsulating the logic for package installation into a reusable function. This change promotes code reuse and simplifies package management tasks within the tool.
- 102-102: Commenting out the `preserveTsFiles` option in the `zod` plugin configuration might have implications on TypeScript file handling during schema generation. Ensure this change aligns with the intended behavior for TypeScript file preservation across different environments and use cases.
- 208-214: The modifications to the `run` command for schema generation with custom schema file paths and output directories enhance flexibility in schema management. It's important to verify that these changes integrate well with existing workflows and do not introduce any regressions in schema generation or application behavior.

  Verification successful

  The modifications to the `run` command for schema generation with custom schema file paths and output directories appear to be well-integrated with existing workflows, as evidenced by their usage in tests for various server adapters. This suggests that the changes are likely to enhance flexibility in schema management without introducing regressions in schema generation or application behavior.
- 225-228: Adding logic to install extra dependencies based on the options provided is a useful enhancement. This change allows for more dynamic and configurable testing environments. However, ensure that the dependencies being installed are validated and necessary for the intended use cases to avoid potential issues with package conflicts or unnecessary installations.
Scripts Executed

The following scripts were executed for the analysis:

Script:

```bash
#!/bin/bash
# Verify integration of custom schema paths and output directories with existing workflows
rg --type ts --type js "loadSchema" --exec cat {} | grep -E "customSchemaFilePath|output"
```

Length of output: 500

Script:

```bash
#!/bin/bash
# Corrected script to verify integration of custom schema paths and output directories with existing workflows
rg --type ts --type js "loadSchema" | grep -E "customSchemaFilePath|output"
```

Length of output: 553
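For orientation, the two helpers discussed above might look something like this sketch; the signatures, flags, and messages are assumptions, and the actual `packages/testtools/src/schema.ts` may differ.

```ts
import { execSync } from 'child_process';

// Run a shell command, logging failures instead of silently swallowing them.
function run(cmd: string, cwd?: string) {
    try {
        execSync(cmd, { cwd, stdio: 'pipe', encoding: 'utf-8' });
    } catch (err) {
        console.error(`Command failed: ${cmd}`, err);
        throw err; // re-throw so callers still observe the failure
    }
}

// Install one or more packages, optionally as dev dependencies.
function installPackage(pkg: string, dev = false, cwd?: string) {
    run(`npm install ${dev ? '--save-dev ' : ''}--no-audit --no-fund ${pkg}`, cwd);
}
```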
```diff
 const workspaceRoot = getWorkspaceRoot(__dirname);
 
-if (!root) {
+if (!workspaceRoot) {
     throw new Error('Could not find workspace root');
 }
 
 const pkgContent = fs.readFileSync(path.join(__dirname, 'package.template.json'), { encoding: 'utf-8' });
 fs.writeFileSync(path.join(projectRoot, 'package.json'), pkgContent.replaceAll('<root>', root));
 
 const npmrcContent = fs.readFileSync(path.join(__dirname, '.npmrc.template'), { encoding: 'utf-8' });
 fs.writeFileSync(path.join(projectRoot, '.npmrc'), npmrcContent.replaceAll('<root>', root));
 
 console.log('Workdir:', projectRoot);
 process.chdir(projectRoot);
 
 // copy project structure from scaffold (prepared by test-setup.ts)
 fs.cpSync(path.join(workspaceRoot, '.test/scaffold'), projectRoot, { recursive: true, force: true });
 
 // install local deps
 const localInstallDeps = [
     'packages/schema/dist',
     'packages/runtime/dist',
     'packages/plugins/swr/dist',
     'packages/plugins/trpc/dist',
     'packages/plugins/openapi/dist',
 ];
 
 run(`npm i --no-audit --no-fund ${localInstallDeps.map((d) => path.join(workspaceRoot, d)).join(' ')}`);
```
Tip

Codebase Verification

The verification process has confirmed that the directories mentioned in the review comment (`.test/scaffold`, `packages/schema/dist`, `packages/runtime/dist`, `packages/plugins/swr/dist`, `packages/plugins/trpc/dist`, and `packages/plugins/openapi/dist`) do not exist in the project's file structure. This directly addresses the concerns raised in the review comment regarding the setup of paths and the accuracy of the local dependencies listed for the tests.
Given this information, it's clear that the initial review comment highlighting the importance of ensuring the correct setup of paths and the necessity of the listed local dependencies for the tests was valid. The absence of these directories could potentially impact the setup process for testing environments as described in the code snippet.
Analysis chain

The refactoring around workspace root handling and the addition of logic to copy project structure and install local dependencies significantly improve the setup process for testing environments. However, it's crucial to ensure that the paths used (e.g., `.test/scaffold`) are correctly set up in the project and that the local dependencies listed are accurate and necessary for the tests.
Scripts Executed

The following scripts were executed for the analysis:

Script:

```bash
#!/bin/bash
# Verify the existence of the scaffold directory and local dependency paths
fd --exec test -d {} .test/scaffold
fd --exec test -d {} packages/schema/dist
fd --exec test -d {} packages/runtime/dist
fd --exec test -d {} packages/plugins/swr/dist
fd --exec test -d {} packages/plugins/trpc/dist
fd --exec test -d {} packages/plugins/openapi/dist
```

Length of output: 140258
Script:

```bash
#!/bin/bash
# Verify the existence of the scaffold directory and local dependency paths using a corrected approach
if [ -d ".test/scaffold" ]; then
  echo ".test/scaffold exists"
else
  echo ".test/scaffold does not exist"
fi
if [ -d "packages/schema/dist" ]; then
  echo "packages/schema/dist exists"
else
  echo "packages/schema/dist does not exist"
fi
if [ -d "packages/runtime/dist" ]; then
  echo "packages/runtime/dist exists"
else
  echo "packages/runtime/dist does not exist"
fi
if [ -d "packages/plugins/swr/dist" ]; then
  echo "packages/plugins/swr/dist exists"
else
  echo "packages/plugins/swr/dist does not exist"
fi
if [ -d "packages/plugins/trpc/dist" ]; then
  echo "packages/plugins/trpc/dist exists"
else
  echo "packages/plugins/trpc/dist does not exist"
fi
if [ -d "packages/plugins/openapi/dist" ]; then
  echo "packages/plugins/openapi/dist exists"
else
  echo "packages/plugins/openapi/dist does not exist"
fi
```

Length of output: 734
Summary by CodeRabbit

- Improved JetBrains IDE support, including complex `@@index` attribute usage and auto-completion within attributes.
- Added `map: String?` for the `@@fulltext` attribute and `_ x: Int?` for `@db.VarBinary`.
- Added `execPackage` for executing package commands.
- Updated `useModelQuery` and `useInfiniteModelQuery` functions in Vue to exclude `queryKey` from options.
- Improved validation of the `@id` attribute and import paths in models.
- Switched to `execPackage` from `execSync` for executing Prisma commands.
- Updated `SchemaLoadingError` to handle an array of error types and refactored error handling logic within tests.