
feat: Add generators in syncing dimensions #638

Merged: 6 commits into master on Oct 1, 2024
Conversation

@Ashutosh619-sudo (Contributor) commented Sep 6, 2024

https://app.clickup.com/1864988/v/l/6-901603904304-1

Summary by CodeRabbit

  • New Features

    • Improved data synchronization methods for currencies, locations, classifications, departments, and subsidiaries, enhancing memory efficiency and performance.
  • Bug Fixes

    • Resolved issues related to handling large datasets, ensuring smoother processing without memory overload.
  • Tests

    • Updated test cases to align with the new generator-based data retrieval methods, improving test accuracy and reliability.
  • Documentation

    • Adjusted data structure in test fixtures for locations to a nested list format, requiring updates in downstream data handling.


coderabbitai bot commented Sep 6, 2024

Walkthrough

The changes involve a transition from synchronous data retrieval to a generator-based approach in the sync_currencies, sync_locations, sync_classifications, sync_departments, and sync_subsidiaries methods of the apps/netsuite/connector.py file. Additionally, the data structure returned by the get_all_locations function in the test fixtures has been modified to a nested list format, and the test functions have been updated to reflect these changes.

Changes

File Path Change Summary
apps/netsuite/connector.py Replaced synchronous data retrieval with a generator-based approach in multiple sync methods.
tests/test_netsuite/fixtures.py Modified get_all_locations to return a nested list structure instead of a list of dictionaries.
tests/test_netsuite/test_connector.py Updated test functions to mock get_all_generator instead of get_all, ensuring alignment with the new API structure.
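The generator-based pattern described in the walkthrough can be sketched as follows. This is a minimal illustration, not the connector's actual code: `get_all_generator` is modeled here as yielding batches (lists) of records, and the attribute fields are simplified.

```python
# Minimal sketch of the generator-based sync pattern in this PR.
# Assumptions: `get_all_generator` yields lists (batches) of records
# instead of returning one large list, so only one batch is held in
# memory at a time; the attribute fields are simplified.

def get_all_generator(records, batch_size=2):
    """Stand-in for the SDK method: yields records in batches."""
    for i in range(0, len(records), batch_size):
        yield records[i:i + batch_size]

def sync_currencies(records):
    """Two nested loops: outer over batches, inner over items."""
    attributes = []
    for batch in get_all_generator(records):
        for currency in batch:
            attributes.append({
                'attribute_type': 'CURRENCY',
                'destination_id': currency['internalId'],
                'value': currency['symbol'],
                'active': True,
            })
    return attributes

attrs = sync_currencies([
    {'internalId': '1', 'symbol': 'USD'},
    {'internalId': '2', 'symbol': 'EUR'},
    {'internalId': '3', 'symbol': 'GBP'},
])
```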

Sequence Diagram(s)

sequenceDiagram
    participant Client
    participant Connector
    participant Generator

    Client->>Connector: Request data sync
    Connector->>Generator: Call get_all_generator()
    loop For each batch
        Generator-->>Connector: Yield batch of items
        loop For each item in batch
            Connector->>Connector: Process item
        end
    end
    Connector-->>Client: Sync complete

🐰 "In the meadow, changes bloom,
A hop, a skip, to clear the room.
With each item, one by one,
Memory saved, the work is fun!
Batches dance, like petals in air,
Oh, what joy, to sync with care!" 🐇


📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Files that changed from the base of the PR and between 3193f81 and a55e222.

📒 Files selected for processing (1)
  • apps/netsuite/connector.py (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • apps/netsuite/connector.py


github-actions bot commented Sep 6, 2024

Tests: 222 | Skipped: 0 💤 | Failures: 2 ❌ | Errors: 0 🔥 | Time: 25.489s ⏱️

@coderabbitai bot left a comment
Actionable comments posted: 1

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 3998b61 and 77dc2bd.

Files selected for processing (1)
  • apps/netsuite/connector.py (5 hunks)
Additional comments not posted (4)
apps/netsuite/connector.py (4)

561-574: Review: Implementation of generator in sync_classifications method.

The method sync_classifications uses a generator (classification_generator) effectively to handle potentially large sets of classification data. The structure and logic of the method are consistent with the changes in other syncing methods, which is good for maintaining uniformity across the module.

Ensure that the bulk_create_or_update_destination_attributes function is capable of handling the data efficiently without performance issues, especially when dealing with large datasets.


496-510: Review: Implementation of generator in sync_currencies method.

The transition to using a generator (currencies_generator) for fetching currencies is implemented correctly. The method iterates over batches of currencies, processing each currency individually within the nested loop. This approach should indeed help in handling large datasets more efficiently by not loading all data into memory at once.

However, ensure that the bulk_create_or_update_destination_attributes function is optimized to handle potentially large batches of data without performance degradation.


585-598: Review: Implementation of generator in sync_departments method.

The implementation of department_generator in the sync_departments method follows the established pattern of using generators for efficient data handling. The method processes departments in batches, which should help in managing memory usage effectively.

As with other methods, verify that the bulk_create_or_update_destination_attributes function is optimized for handling large volumes of data efficiently.


931-945: Review: Implementation of generator in sync_subsidiaries method.

The sync_subsidiaries method correctly uses a generator (subsidiary_generator) to fetch subsidiaries in batches. This approach is consistent with the other changes in the module and is effective for handling large datasets without loading all data into memory.

The method processes each subsidiary individually within the nested loop, which is good for memory efficiency. Ensure that the bulk_create_or_update_destination_attributes function is capable of handling these operations efficiently.
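All four reviews above describe the same pattern. The memory difference between the old and new retrieval styles can be sketched like this (page size and record shape are made up for illustration):

```python
# Contrast: `get_all` materializes every record at once, while
# `get_all_generator` yields one page at a time. itertools.chain
# can flatten the generator when a flat view is still needed.
import itertools

PAGES = [[{'internalId': str(i)} for i in range(p * 2, p * 2 + 2)]
         for p in range(3)]

def get_all():
    """Old style: one big list (all 6 records in memory at once)."""
    return [record for page in PAGES for record in page]

def get_all_generator():
    """New style: yields 3 pages of 2 records each."""
    for page in PAGES:
        yield page

flat = get_all()
paged = list(get_all_generator())
flattened = list(itertools.chain.from_iterable(get_all_generator()))
```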

Comment on lines 523 to 543
        location_generator = self.connection.locations.get_all_generator()

        location_attributes = []

        for locations in location_generator:
            for location in locations:
                if not location['isInactive']:
                    if 'subsidiaryList' in location and location['subsidiaryList']:
                        subsidiaries = location['subsidiaryList']['recordRef']
                        counter = 0
                        if subsidiaries[counter]['internalId'] == subsidiary_mapping.internal_id:
                            counter += 1
                            location_attributes.append({
                                'attribute_type': 'LOCATION',
                                'display_name': 'Location',
                                'value': location['name'],
                                'destination_id': location['internalId'],
                                'active': not location['isInactive']
                            })
                    else:

Review: Implementation of generator in sync_locations method.

The use of location_generator to fetch locations in batches is correctly implemented. The nested loop structure allows for incremental processing of each location, which is beneficial for memory management. The additional logic to check for active locations and subsidiary matching is preserved, which is good for maintaining the existing functionality.

One potential issue is the use of a counter (counter) which seems to be intended for iterating over subsidiaries but is not incremented in a loop. This might be a bug or a misunderstanding in the code logic.

Consider revising the logic around the counter variable to ensure it functions as intended, or remove it if it's not necessary.

-                        counter = 0
-                        if subsidiaries[counter]['internalId'] == subsidiary_mapping.internal_id:
-                            counter += 1
+                        for subsidiary in subsidiaries:
+                            if subsidiary['internalId'] == subsidiary_mapping.internal_id:
Committable suggestion

Suggested change:

        location_generator = self.connection.locations.get_all_generator()
        location_attributes = []
        for locations in location_generator:
            for location in locations:
                if not location['isInactive']:
                    if 'subsidiaryList' in location and location['subsidiaryList']:
                        subsidiaries = location['subsidiaryList']['recordRef']
                        for subsidiary in subsidiaries:
                            if subsidiary['internalId'] == subsidiary_mapping.internal_id:
                                location_attributes.append({
                                    'attribute_type': 'LOCATION',
                                    'display_name': 'Location',
                                    'value': location['name'],
                                    'destination_id': location['internalId'],
                                    'active': not location['isInactive']
                                })
                    else:


github-actions bot commented Sep 6, 2024

Tests: 222 | Skipped: 0 💤 | Failures: 0 ❌ | Errors: 0 🔥 | Time: 24.830s ⏱️

@coderabbitai bot left a comment

Actionable comments posted: 0

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 77dc2bd and 0b82f13.

Files selected for processing (3)
  • apps/netsuite/connector.py (5 hunks)
  • tests/test_netsuite/fixtures.py (2 hunks)
  • tests/test_netsuite/test_connector.py (3 hunks)
Files skipped from review as they are similar to previous changes (1)
  • apps/netsuite/connector.py
Additional comments not posted (4)
tests/test_netsuite/test_connector.py (3)

408-408: Approved: Transition to generator-based method.

The change from get_all to get_all_generator in test_sync_currencies aligns with the PR's objectives to use generators for syncing dimensions. Ensure that the integration with the rest of the system is seamless and that all dependent functionalities are updated accordingly.

Run the following script to verify the integration:

Verification successful

Verified: Transition to get_all_generator in test_sync_currencies.

The test_sync_currencies function in tests/test_netsuite/test_connector.py correctly mocks the get_all_generator method, confirming the transition to a generator-based approach. This change aligns with the PR's objectives, and no issues were found with this specific update. Ensure that any other dependencies on the old method are also updated if necessary.

Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the integration of `get_all_generator` in `test_sync_currencies`.

# Test: Search for the function usage. Expect: Only occurrences of the new signature.
rg --type python -A 5 $'test_sync_currencies'

Length of output: 75


Script:

#!/bin/bash
# Description: Verify the integration of `get_all_generator` in `test_sync_currencies`.

# Test: Search for the function usage. Expect: Only occurrences of the new signature.
rg --type py -A 5 $'test_sync_currencies'

Length of output: 524


328-328: Approved: Transition to generator-based method.

The change from get_all to get_all_generator in test_sync_locations aligns with the PR's objectives to use generators for syncing dimensions. Ensure that the integration with the rest of the system is seamless and that all dependent functionalities are updated accordingly.

Run the following script to verify the integration:

Verification successful

Verified: Successful integration of get_all_generator.

The transition to using get_all_generator is well-integrated across the codebase, including in test files such as tests/test_netsuite/test_connector.py. This confirms that the generator-based approach is being thoroughly tested and utilized. No further issues were found regarding this integration.

Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the integration of `get_all_generator` in `test_sync_locations`.

# Test: Search for the function usage. Expect: Only occurrences of the new signature.
rg --type python -A 5 $'test_sync_locations'

Length of output: 74


Script:

#!/bin/bash
# Description: Verify the integration of `get_all_generator` in the codebase.

# Corrected Test: Search for the usage of `get_all_generator` in Python files.
rg 'get_all_generator' --type py -A 5

Length of output: 19655


363-363: Approved: Transition to generator-based method.

The change from get_all to get_all_generator in test_sync_customers aligns with the PR's objectives to use generators for syncing dimensions. Ensure that the integration with the rest of the system is seamless and that all dependent functionalities are updated accordingly.

Run the following script to verify the integration:

Verification successful

Verified: Integration of get_all_generator in test_sync_customers.

The test_sync_customers function in tests/test_netsuite/test_connector.py correctly uses the get_all_generator method, confirming the transition to a generator-based approach. This aligns with the PR's objectives, and no issues were found with the integration.

  • Location: tests/test_netsuite/test_connector.py
Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the integration of `get_all_generator` in `test_sync_customers`.

# Test: Search for the function usage. Expect: Only occurrences of the new signature.
rg --type python -A 5 $'test_sync_customers'

Length of output: 74


Script:

#!/bin/bash
# Description: Verify the integration of `get_all_generator` in `test_sync_customers`.

# Test: Search for the function usage. Expect: Only occurrences of the new signature.
rg --type py -A 5 $'test_sync_customers'

Length of output: 442

tests/test_netsuite/fixtures.py (1)

Line range hint 6557-6760: Ensure proper handling of nested list structure in get_all_locations.

The function get_all_locations has been modified to return a nested list structure. It is crucial to ensure that:

  • The new data structure is correctly implemented.
  • All downstream processes that consume this data are updated to handle the nested list structure.
  • Adequate error handling and data validation are in place to manage the nested lists effectively.

Please verify that all integrations and dependent modules have been updated to accommodate this change.
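To illustrate the fixture change described above (hypothetical values; the real fixture in tests/test_netsuite/fixtures.py has many more fields): the data moves from a flat list of dicts to a list of batches, so consumers need one extra loop.

```python
# Illustration of the fixture change. Before, get_all_locations was a
# flat list of dicts; now it is a nested list, one inner list per
# generator batch. Field values here are made up.

get_all_locations = [
    [
        {'internalId': '1', 'name': 'HQ', 'isInactive': False},
        {'internalId': '2', 'name': 'Warehouse', 'isInactive': True},
    ]
]

# Downstream consumers must unwrap one extra level of nesting:
active_names = []
for batch in get_all_locations:
    for location in batch:
        if not location['isInactive']:
            active_names.append(location['name'])
```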

            'active': True
        }
    )
for currency in currencies_generator:
A contributor commented:

Can you test this once?

I don't think this is how the generator works; you will need two for loops.

PR title must start with "fix:", "feat:", "chore:", "refactor", or "test:" (case-insensitive)


PR description must contain a link to a ClickUp (case-insensitive)


Tests: 222 | Skipped: 0 💤 | Failures: 0 ❌ | Errors: 0 🔥 | Time: 24.799s ⏱️

@Ashutosh619-sudo Ashutosh619-sudo added the deploy Triggers deployment of active branch to Staging label Sep 23, 2024
@coderabbitai bot left a comment

Actionable comments posted: 1

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 0b82f13 and 3193f81.

Files selected for processing (1)
  • apps/netsuite/connector.py (5 hunks)
Additional comments not posted (5)
apps/netsuite/connector.py (5)

496-509: Improved efficiency in sync_currencies method

The modification to use a generator (get_all_generator()) instead of retrieving all currencies at once is a good improvement. This change can significantly reduce memory usage, especially when dealing with large datasets, as it allows for processing currencies in batches rather than loading all of them into memory at once.

The core logic for creating currency attributes remains unchanged, ensuring that the functionality is preserved while improving efficiency.


559-572: Improved efficiency in sync_classifications method

The modification to use a generator (get_all_generator()) for retrieving classifications is a good improvement. This change enhances memory efficiency, especially when dealing with large datasets, as it allows for processing classifications in batches rather than loading all of them into memory at once.

The core logic for creating classification attributes remains unchanged, ensuring that the functionality is preserved while improving efficiency. The nested loop structure correctly handles the batches of classifications provided by the generator.


583-596: Improved efficiency in sync_departments method

The change to use a generator (get_all_generator()) for retrieving departments is a positive improvement. This modification enhances memory efficiency, particularly when dealing with large datasets, as it allows for processing departments in batches rather than loading all of them into memory at once.

The core logic for creating department attributes remains unchanged, ensuring that the functionality is preserved while improving efficiency. The nested loop structure correctly handles the batches of departments provided by the generator.


929-943: Improved efficiency in sync_subsidiaries method

The transition to using a generator (get_all_generator()) for retrieving subsidiaries is a commendable improvement. This change enhances memory efficiency, particularly when dealing with large datasets, as it allows for processing subsidiaries in batches rather than loading all of them into memory at once.

The core logic for creating subsidiary attributes remains unchanged, ensuring that the functionality is preserved while improving efficiency. The nested loop structure correctly handles the batches of subsidiaries provided by the generator.


Line range hint 1-2155: Overall improvement in sync methods, with one minor issue

The changes made to the sync methods (sync_currencies, sync_locations, sync_classifications, sync_departments, and sync_subsidiaries) in this file consistently improve memory efficiency by transitioning from retrieving all records at once to using generators. This approach is particularly beneficial when dealing with large datasets, as it allows for processing data in batches.

The core logic for creating attributes in each method remains largely unchanged, which helps maintain the existing functionality while improving performance.

However, there is a minor issue in the sync_locations method regarding the usage of a counter variable, which should be addressed to ensure correct functionality.

Overall, these changes represent a positive step towards more efficient data synchronization with NetSuite.

Comment on lines +522 to +541
        location_generator = self.connection.locations.get_all_generator()

        location_attributes = []

        for locations in location_generator:
            for location in locations:
                if not location['isInactive']:
                    if 'subsidiaryList' in location and location['subsidiaryList']:
                        subsidiaries = location['subsidiaryList']['recordRef']
                        counter = 0
                        if subsidiaries[counter]['internalId'] == subsidiary_mapping.internal_id:
                            counter += 1
                            location_attributes.append({
                                'attribute_type': 'LOCATION',
                                'display_name': 'Location',
                                'value': location['name'],
                                'destination_id': location['internalId'],
                                'active': not location['isInactive']
                            })
                    else:

Improved sync_locations method, but check counter logic

The transition to using a generator (get_all_generator()) for retrieving locations is a good improvement for memory efficiency, especially when dealing with large datasets. The nested loop structure allows for incremental processing of each location, which is beneficial.

However, there's an issue with the counter variable:

  1. It's initialized before the inner loop but incremented inside it.
  2. It's not reset for each new batch of locations from the generator.
  3. It's only incremented when a condition is met, which may lead to incorrect indexing.

Consider revising the logic around the counter variable. If it's meant to iterate over subsidiaries, you might want to replace it with a loop over the subsidiaries list. For example:

for location in locations:
    if not location['isInactive']:
        if 'subsidiaryList' in location and location['subsidiaryList']:
            subsidiaries = location['subsidiaryList']['recordRef']
            for subsidiary in subsidiaries:
                if subsidiary['internalId'] == subsidiary_mapping.internal_id:
                    location_attributes.append({
                        'attribute_type': 'LOCATION',
                        'display_name': 'Location',
                        'value': location['name'],
                        'destination_id': location['internalId'],
                        'active': not location['isInactive']
                    })
                    break  # Assuming you only want to add the location once if any subsidiary matches
        else:
            # Existing code for locations without subsidiaryList

This approach ensures that all subsidiaries are checked for each location, and the location is added if any subsidiary matches.

@github-actions github-actions bot added the size/M Medium PR label Sep 27, 2024

Tests: 222 | Skipped: 0 💤 | Failures: 0 ❌ | Errors: 0 🔥 | Time: 25.891s ⏱️

@Ashutosh619-sudo Ashutosh619-sudo changed the title Add generators in syncing dimensions feat: Add generators in syncing dimensions Sep 27, 2024
@Ashutosh619-sudo Ashutosh619-sudo changed the title feat: Add generators in syncing dimensions feat| Add generators in syncing dimensions Sep 27, 2024
@Ashutosh619-sudo Ashutosh619-sudo changed the title feat| Add generators in syncing dimensions feat | Add generators in syncing dimensions Sep 27, 2024
@Ashutosh619-sudo Ashutosh619-sudo changed the title feat | Add generators in syncing dimensions feat: Add generators in syncing dimensions Sep 27, 2024

Tests: 222 | Skipped: 0 💤 | Failures: 0 ❌ | Errors: 0 🔥 | Time: 26.461s ⏱️


            'destination_id': currency['internalId'],
            'active': True
        }
    )

DestinationAttribute.bulk_create_or_update_destination_attributes(
A contributor commented:

Upserting to the DB should also be done per page.
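The reviewer's suggestion can be sketched as follows, with `upsert` standing in for `bulk_create_or_update_destination_attributes` (its real signature differs, and the pages are hypothetical): call the upsert once per yielded page instead of accumulating everything first, so memory stays bounded by the page size.

```python
# Sketch of per-page upserts. `upsert` stands in for
# bulk_create_or_update_destination_attributes; pages are
# hypothetical one-item batches for illustration.

upserted_pages = []

def upsert(attributes):
    """Record each upsert call so we can see one call per page."""
    upserted_pages.append(list(attributes))

def pages():
    yield [{'internalId': '1', 'name': 'A'}]
    yield [{'internalId': '2', 'name': 'B'}]

def sync(generator):
    for page in generator:
        page_attributes = [
            {'destination_id': item['internalId'], 'value': item['name']}
            for item in page
        ]
        upsert(page_attributes)  # one DB call per page: bounded memory

sync(pages())
```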

            'value': location['name'],
            'destination_id': location['internalId'],
            'active': not location['isInactive']
        })

DestinationAttribute.bulk_create_or_update_destination_attributes(
A contributor commented:

Upserting to the DB should also be done per page.

            'value': department['name'],
            'destination_id': department['internalId'],
            'active': not department['isInactive']
        })

DestinationAttribute.bulk_create_or_update_destination_attributes(
A contributor commented:

Take care of this everywhere.

@@ -405,7 +405,7 @@ def test_sync_tax_items(mocker, db):

 def test_sync_currencies(mocker, db):
     mocker.patch(
-        'netsuitesdk.api.currencies.Currencies.get_all',
+        'netsuitesdk.api.currencies.Currencies.get_all_generator',
         return_value=data['get_all_currencies'][0]
A contributor commented:

The response structure would change.
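Concretely, since `get_all_generator` yields batches rather than returning a flat list, the patched return value needs one extra level of nesting. A sketch with `unittest.mock` (the real tests patch the netsuitesdk paths shown in the diff via pytest-mock's `mocker.patch`; field names here are illustrative):

```python
# Sketch of the mock-structure change the reviewer is pointing at.
# Old get_all shape: a flat list of records. New get_all_generator
# shape: an iterable of batches, so the mock wraps the flat list in
# one more list.
from unittest import mock

flat = [{'internalId': '1', 'symbol': 'USD'}]  # old get_all shape

currencies_api = mock.Mock()
currencies_api.get_all_generator.return_value = [flat]  # nested shape

collected = []
for batch in currencies_api.get_all_generator():
    for currency in batch:
        collected.append(currency['internalId'])
```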

@@ -325,7 +325,7 @@ def test_sync_subsidiaries(mocker, db):

 def test_sync_locations(mocker, db):
     mocker.patch(
-        'netsuitesdk.api.locations.Locations.get_all',
+        'netsuitesdk.api.locations.Locations.get_all_generator',
         return_value=data['get_all_locations']
A contributor commented:

The response structure would change.


Tests: 222 | Skipped: 0 💤 | Failures: 0 ❌ | Errors: 0 🔥 | Time: 25.309s ⏱️


codecov bot commented Sep 30, 2024

Codecov Report

Attention: Patch coverage is 83.87097% with 5 lines in your changes missing coverage. Please review.

Project coverage is 88.70%. Comparing base (5d5cdf2) to head (a55e222).
Report is 36 commits behind head on master.

Files with missing lines Patch % Lines
apps/netsuite/connector.py 83.87% 5 Missing ⚠️
Additional details and impacted files

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #638      +/-   ##
==========================================
- Coverage   89.31%   88.70%   -0.61%     
==========================================
  Files          59       59              
  Lines        5193     5382     +189     
==========================================
+ Hits         4638     4774     +136     
- Misses        555      608      +53     
Flag Coverage Δ
88.70% <83.87%> (?)

Flags with carried forward coverage won't be shown. Click here to find out more.

Files with missing lines Coverage Δ
apps/netsuite/connector.py 88.79% <83.87%> (+0.07%) ⬆️

... and 4 files with indirect coverage changes

@Ashutosh619-sudo Ashutosh619-sudo merged commit 0351989 into master Oct 1, 2024
4 of 6 checks passed
Ashutosh619-sudo added a commit that referenced this pull request Oct 1, 2024
* Add generators in syncing dimensions

* resolved tests

* comment resolved