Refactor job sync #193

Merged 57 commits on Jun 21, 2024
Commits
f6df1ec
Delete purchase_invoice, line_items on failed exports from hh2, type-…
Hrishabh17 May 21, 2024
372c628
Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api
Hrishabh17 May 21, 2024
d0ad337
Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api
Hrishabh17 May 24, 2024
7825b9e
Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api
Hrishabh17 May 27, 2024
b79c74c
Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api
Hrishabh17 May 27, 2024
97779c6
Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api
Hrishabh17 May 28, 2024
601e3cb
Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api
Hrishabh17 May 28, 2024
fb78432
Refactor deps schedule to run post Job import
Hrishabh17 Jun 3, 2024
57a4ed4
Fyle Card <> Vendor Mapping setup
Hrishabh17 Jun 3, 2024
19dfa7d
Added script to add mapping_settings
Hrishabh17 Jun 3, 2024
151e397
Fix post release script
Hrishabh17 Jun 4, 2024
ffbe1f5
Projects and Deps fields disable v1
Hrishabh17 Jun 4, 2024
372e5ab
Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api
Hrishabh17 Jun 4, 2024
737c79d
Remove dep setting trigger, add logger
Hrishabh17 Jun 4, 2024
da072df
Merge branch 'refactor-deps-import' into vendor-card-mapping
Hrishabh17 Jun 4, 2024
e2f4cef
Modified script, added additional test case
Hrishabh17 Jun 4, 2024
5bc9075
lint fix
Hrishabh17 Jun 4, 2024
c6a4697
Remove mock object
Hrishabh17 Jun 4, 2024
f667558
Merge branch 'refactor-deps-import' into vendor-card-mapping
Hrishabh17 Jun 4, 2024
1c2e3c1
Add details while logging
Hrishabh17 Jun 4, 2024
0e98a8a
Merge branch 'refactor-deps-import' into vendor-card-mapping
Hrishabh17 Jun 4, 2024
d23f0cd
modify post-release script
Hrishabh17 Jun 4, 2024
7dd6145
Merge branch 'vendor-card-mapping' into disable-sage-fields
Hrishabh17 Jun 4, 2024
f1a31f1
bump accounting-mapping version
Hrishabh17 Jun 4, 2024
505fabc
modify the variable_name, add conditional update
Hrishabh17 Jun 5, 2024
e85f6b2
Add example objects
Hrishabh17 Jun 5, 2024
b1f39ca
Added loggers
Hrishabh17 Jun 5, 2024
555a72c
Added test cases
Hrishabh17 Jun 5, 2024
675c37a
Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api
Hrishabh17 Jun 5, 2024
84ba2e9
Dependent Field optimizations
Hrishabh17 Jun 6, 2024
6dd28a9
fix failing test
Hrishabh17 Jun 6, 2024
aa96f62
Added Import Log for Deps
Hrishabh17 Jun 6, 2024
51e76e5
Fix post-release script
Hrishabh17 Jun 6, 2024
6cbcb7d
Merge branch 'dep-field-optimization' into dep-import-log
Hrishabh17 Jun 6, 2024
6c54fdc
fix failing test cases
Hrishabh17 Jun 6, 2024
1da8a7d
Set cost category import to fail on cost code fail
Hrishabh17 Jun 7, 2024
9aefa9f
Modify handle_import_exception for both class and func
Hrishabh17 Jun 7, 2024
173d31a
Modify test cases
Hrishabh17 Jun 7, 2024
5dac65e
fix ordering of saving import_log
Hrishabh17 Jun 7, 2024
ef99211
Move the import_log creation to ImportLog method
Hrishabh17 Jun 7, 2024
771c939
Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api
Hrishabh17 Jun 10, 2024
0d688ac
Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api
Hrishabh17 Jun 11, 2024
53d90b1
Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api
Hrishabh17 Jun 11, 2024
766beb3
Merged master and bumped accounting-mappings version
Hrishabh17 Jun 11, 2024
8393600
Add logger in import
Hrishabh17 Jun 12, 2024
7a6d4fa
Merge branch 'master' into dep-import-log
Hrishabh17 Jun 12, 2024
7a334fb
Refactor the job sync v1
Hrishabh17 Jun 13, 2024
369009f
Fix few sync issues
Hrishabh17 Jun 13, 2024
592dc6d
Merge branch 'master' into refactor-job-sync
Hrishabh17 Jun 13, 2024
f6abaaa
Remove extra loggers
Hrishabh17 Jun 13, 2024
0208231
fix lint
Hrishabh17 Jun 13, 2024
4048f92
Refactor Job sync v2
Hrishabh17 Jun 13, 2024
1b0209b
Add batch count
Hrishabh17 Jun 13, 2024
f384ba4
Fixed the import_log related comments
Hrishabh17 Jun 20, 2024
c3a0607
modify import logs save
Hrishabh17 Jun 20, 2024
852d69f
Merge branch 'master' into refactor-job-sync
Hrishabh17 Jun 20, 2024
352823b
Remove logger
Hrishabh17 Jun 21, 2024
3 changes: 2 additions & 1 deletion apps/fyle/serializers.py
@@ -15,7 +15,7 @@
from apps.workspaces.models import Workspace, FyleCredential
from apps.fyle.models import ExpenseFilter, DependentFieldSetting
from apps.fyle.helpers import get_expense_fields

from apps.mappings.imports.queues import chain_import_fields_to_fyle

logger = logging.getLogger(__name__)
logger.level = logging.INFO
@@ -39,6 +39,7 @@ def create(self, validated_data):
platform = PlatformConnector(fyle_credentials)

if refresh:
chain_import_fields_to_fyle(workspace_id=workspace_id)
Use proper exception chaining to clarify exception sources.

When re-raising an exception, use 'raise ... from exception' so the original error is preserved as the cause instead of being masked.

- raise APIException("Internal Server Error", code='server_error')
+ raise APIException("Internal Server Error", code='server_error') from exception

Committable suggestion was skipped due to low confidence.
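A minimal sketch of the chaining the bot suggests, wrapping the dimension import in a try/except; the refresh_dimensions wrapper and the except clause are illustrative, not the serializer's actual code:

```python
from rest_framework.exceptions import APIException


def refresh_dimensions(platform):
    try:
        platform.import_fyle_dimensions()
    except Exception as exception:
        # "from exception" links the APIException to the original error,
        # so the traceback shows the real cause instead of masking it.
        raise APIException("Internal Server Error", code='server_error') from exception
```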

platform.import_fyle_dimensions()
workspace.source_synced_at = datetime.now()
workspace.save(update_fields=['source_synced_at'])
10 changes: 7 additions & 3 deletions apps/mappings/exceptions.py
@@ -17,8 +17,12 @@


def handle_import_exceptions(func):
def new_fn(expense_attribute_instance, *args):
import_log: ImportLog = args[0]
def new_fn(expense_attribute_instance, *args, **kwargs):

Refactor: Use f-strings for better readability.

The handle_import_exceptions function is well-structured for handling various exceptions. However, using f-strings can enhance readability and maintainability. Here's a suggested refactor:

- 'Import {0} to Fyle and Auto Create Mappings'.format(attribute_type)
+ f'Import {attribute_type} to Fyle and Auto Create Mappings'

- 'Invalid Token or Sage 300 credentials does not exist workspace_id - {0}'.format(workspace_id)
+ f'Invalid Token or Sage 300 credentials does not exist workspace_id - {workspace_id}'

Also applies to: 29-29, 43-43

import_log = None
if isinstance(expense_attribute_instance, ImportLog):
import_log: ImportLog = expense_attribute_instance
else:
import_log: ImportLog = args[0]
workspace_id = import_log.workspace_id
attribute_type = import_log.attribute_type
error = {
@@ -28,7 +32,7 @@ def new_fn(expense_attribute_instance, *args):
'response': None
}
try:
return func(expense_attribute_instance, *args)
return func(expense_attribute_instance, *args, **kwargs)
except WrongParamsError as exception:
error['message'] = exception.message
error['response'] = exception.response
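For orientation, a condensed sketch of how the reworked decorator resolves the ImportLog from either calling convention (a plain function that receives the log as its first argument, or a bound method that receives it in args[0]); the FATAL status and error payload below are assumptions, not the file's exact exception handling:

```python
import logging
import traceback

from apps.mappings.models import ImportLog

logger = logging.getLogger(__name__)


def handle_import_exceptions(func):
    def new_fn(expense_attribute_instance, *args, **kwargs):
        # The decorated callable is either a module-level function whose first
        # argument is the ImportLog, or an instance method where the log is
        # the first extra positional argument.
        if isinstance(expense_attribute_instance, ImportLog):
            import_log = expense_attribute_instance
        else:
            import_log = args[0]

        try:
            return func(expense_attribute_instance, *args, **kwargs)
        except Exception:
            # Persist the failure on the shared log so callers can inspect
            # the status without re-running the task.
            import_log.status = 'FATAL'
            import_log.error_log = {'error': traceback.format_exc()}
            import_log.save()
            logger.exception('Import failed for workspace %s', import_log.workspace_id)

    return new_fn
```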
Empty file added apps/mappings/helpers.py
Empty file.
7 changes: 7 additions & 0 deletions apps/mappings/imports/modules/base.py
@@ -1,4 +1,5 @@
import math
import logging
from typing import List
from datetime import (
datetime,
@@ -20,6 +21,10 @@
from apps.accounting_exports.models import Error


logger = logging.getLogger(__name__)
logger.level = logging.INFO


class Base:
"""
The Base class for all the modules
@@ -299,6 +304,8 @@ def post_to_fyle_and_sync(self, fyle_payload: List[object], resource_class, is_l
:param is_last_batch: bool
:param import_log: ImportLog object
"""
logger.info("| Importing {} to Fyle | Content: {{WORKSPACE_ID: {} Fyle Payload count: {} is_last_batch: {}}}".format(self.destination_field, self.workspace_id, len(fyle_payload), is_last_batch))

Optimize logging with f-strings.

Convert the logging statement to use f-strings for better performance and readability.

- logger.info("| Importing {} to Fyle | Content: {{WORKSPACE_ID: {} Fyle Payload count: {} is_last_batch: {}}}".format(self.destination_field, self.workspace_id, len(fyle_payload), is_last_batch))
+ logger.info(f"| Importing {self.destination_field} to Fyle | Content: {{WORKSPACE_ID: {self.workspace_id} Fyle Payload count: {len(fyle_payload)} is_last_batch: {is_last_batch}}}")
Suggested change
logger.info("| Importing {} to Fyle | Content: {{WORKSPACE_ID: {} Fyle Payload count: {} is_last_batch: {}}}".format(self.destination_field, self.workspace_id, len(fyle_payload), is_last_batch))
logger.info(f"| Importing {self.destination_field} to Fyle | Content: {{WORKSPACE_ID: {self.workspace_id} Fyle Payload count: {len(fyle_payload)} is_last_batch: {is_last_batch}}}")
Ruff (307): Use f-string instead of format call (UP032)

if fyle_payload and self.platform_class_name in ['expense_custom_fields', 'merchants']:
resource_class.post(fyle_payload)
elif fyle_payload:
41 changes: 24 additions & 17 deletions apps/mappings/imports/modules/projects.py
@@ -1,6 +1,7 @@
from datetime import datetime
from typing import List
from apps.mappings.imports.modules.base import Base
from apps.sage300.models import CostCategory
from fyle_accounting_mappings.models import DestinationAttribute


@@ -39,23 +40,29 @@ def construct_fyle_payload(
"""
payload = []

job_ids_in_cost_category = CostCategory.objects.filter(
workspace_id = self.workspace_id,
job_id__in = [attribute.destination_id for attribute in paginated_destination_attributes]
).values_list('job_id', flat=True).distinct()
Comment on lines +43 to +46

Consider using a set comprehension for performance.

The current list comprehension inside the filter might be less efficient for larger datasets. Using a set comprehension can improve performance since membership tests in sets are faster.

- job_id__in=[attribute.destination_id for attribute in paginated_destination_attributes]
+ job_id__in={attribute.destination_id for attribute in paginated_destination_attributes}
Suggested change
job_ids_in_cost_category = CostCategory.objects.filter(
workspace_id = self.workspace_id,
job_id__in = [attribute.destination_id for attribute in paginated_destination_attributes]
).values_list('job_id', flat=True).distinct()
job_ids_in_cost_category = CostCategory.objects.filter(
workspace_id = self.workspace_id,
job_id__in = {attribute.destination_id for attribute in paginated_destination_attributes}
).values_list('job_id', flat=True).distinct()


for attribute in paginated_destination_attributes:
project = {
'name': attribute.value,
'code': attribute.destination_id,
'description': 'Sage 300 Project - {0}, Id - {1}'.format(
attribute.value,
attribute.destination_id
),
'is_enabled': True if attribute.active is None else attribute.active
}

# Create a new project if it does not exist in Fyle
if attribute.value.lower() not in existing_fyle_attributes_map:
payload.append(project)
# Disable the existing project in Fyle if auto-sync status is allowed and the destination_attributes is inactive
elif is_auto_sync_status_allowed and not attribute.active:
project['id'] = existing_fyle_attributes_map[attribute.value.lower()]
payload.append(project)
if attribute.destination_id in job_ids_in_cost_category:
project = {
'name': attribute.value,
'code': attribute.destination_id,
'description': 'Sage 300 Project - {0}, Id - {1}'.format(
attribute.value,
attribute.destination_id
),
'is_enabled': True if attribute.active is None else attribute.active
}
Comment on lines +53 to +58

Optimize string formatting using f-strings.

Using f-strings can make the code cleaner and potentially faster.

- 'description': 'Sage 300 Project - {0}, Id - {1}'.format(attribute.value, attribute.destination_id)
+ 'description': f'Sage 300 Project - {attribute.value}, Id - {attribute.destination_id}'
Suggested change
'description': 'Sage 300 Project - {0}, Id - {1}'.format(
attribute.value,
attribute.destination_id
),
'is_enabled': True if attribute.active is None else attribute.active
}
'description': f'Sage 300 Project - {attribute.value}, Id - {attribute.destination_id}',
'is_enabled': True if attribute.active is None else attribute.active
Ruff (53-56): Use implicit references for positional format fields (UP030)
Ruff (53-56): Use f-string instead of format call (UP032)

# Create a new project if it does not exist in Fyle
if attribute.value.lower() not in existing_fyle_attributes_map:
payload.append(project)
# Disable the existing project in Fyle if auto-sync status is allowed and the destination_attributes is inactive
elif is_auto_sync_status_allowed and not attribute.active:
project['id'] = existing_fyle_attributes_map[attribute.value.lower()]
payload.append(project)

return payload
13 changes: 9 additions & 4 deletions apps/mappings/imports/queues.py
@@ -2,6 +2,7 @@
from fyle_accounting_mappings.models import MappingSetting
from apps.workspaces.models import ImportSetting
from apps.fyle.models import DependentFieldSetting
from apps.mappings.models import ImportLog


def chain_import_fields_to_fyle(workspace_id):
@@ -17,6 +18,13 @@ def chain_import_fields_to_fyle(workspace_id):

chain = Chain()

if project_mapping and dependent_field_settings:
cost_code_import_log = ImportLog.create('COST_CODE', workspace_id)
cost_category_import_log = ImportLog.create('COST_CATEGORY', workspace_id)
chain.append('apps.mappings.tasks.sync_sage300_attributes', 'JOB', workspace_id)
chain.append('apps.mappings.tasks.sync_sage300_attributes', 'COST_CODE', workspace_id, cost_code_import_log)
chain.append('apps.mappings.tasks.sync_sage300_attributes', 'COST_CATEGORY', workspace_id, cost_category_import_log)

if import_settings.import_categories:
chain.append(
'apps.mappings.imports.tasks.trigger_import_via_schedule',
@@ -52,10 +60,7 @@ def chain_import_fields_to_fyle(workspace_id):
)

if project_mapping and dependent_field_settings:
chain.append(
'apps.mappings.imports.tasks.auto_import_and_map_fyle_fields',
workspace_id
)
chain.append('apps.sage300.dependent_fields.import_dependent_fields_to_fyle', workspace_id)

if chain.length() > 0:
chain.run()
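For context on the chaining mechanism used above, a minimal sketch of how django-q's Chain runs these tasks sequentially; the task paths come from the diff, the workspace id of 1 is illustrative, and the snippet assumes a running django-q cluster:

```python
from django_q.tasks import Chain

# Each append registers a task by its dotted path plus positional arguments;
# run() enqueues them so they execute one after another on the worker,
# which is what lets the COST_CODE sync wait for the JOB sync to finish.
chain = Chain()
chain.append('apps.mappings.tasks.sync_sage300_attributes', 'JOB', 1)
chain.append('apps.sage300.dependent_fields.import_dependent_fields_to_fyle', 1)

if chain.length() > 0:
    chain.run()
```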
25 changes: 1 addition & 24 deletions apps/mappings/imports/tasks.py
@@ -1,5 +1,5 @@
import logging
from django_q.tasks import Chain
# from django_q.tasks import Chain

from apps.mappings.models import ImportLog
from apps.mappings.imports.modules.categories import Category
@@ -36,26 +36,3 @@ def trigger_import_via_schedule(workspace_id: int, destination_field: str, sourc
module_class = SOURCE_FIELD_CLASS_MAP[source_field]
item = module_class(workspace_id, destination_field, sync_after)
item.trigger_import()


def auto_import_and_map_fyle_fields(workspace_id):
"""
Auto import and map fyle fields
"""
import_log = ImportLog.objects.filter(
workspace_id=workspace_id,
attribute_type = 'PROJECT'
).first()

chain = Chain()

chain.append('apps.mappings.tasks.sync_sage300_attributes', 'JOB', workspace_id)
chain.append('apps.mappings.tasks.sync_sage300_attributes', 'COST_CODE', workspace_id)
chain.append('apps.mappings.tasks.sync_sage300_attributes', 'COST_CATEGORY', workspace_id)
chain.append('apps.sage300.dependent_fields.import_dependent_fields_to_fyle', workspace_id)

if import_log and import_log.status != 'COMPLETE':
logger.error(f"Project Import is in {import_log.status} state in WORKSPACE_ID: {workspace_id} with error {str(import_log.error_log)}")

if chain.length() > 0:
chain.run()
14 changes: 14 additions & 0 deletions apps/mappings/models.py
@@ -35,6 +35,20 @@
db_table = 'import_logs'
unique_together = ('workspace', 'attribute_type')

@classmethod
def create(self, attribute_type, workspace_id):
"""
Create import logs set to IN_PROGRESS
"""
import_log, _ = self.objects.update_or_create(
workspace_id=workspace_id,
attribute_type=attribute_type,
defaults={
'status': 'IN_PROGRESS'
}
)
return import_log


class Version(BaseModel):
"""
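A short usage sketch of the new classmethod, with a hypothetical workspace id; because the model declares unique_together on (workspace, attribute_type), update_or_create makes the call idempotent and simply resets an existing row to IN_PROGRESS:

```python
from apps.mappings.models import ImportLog

# Re-running an import reuses the same row rather than inserting a duplicate.
cost_code_log = ImportLog.create('COST_CODE', 1)
assert cost_code_log.status == 'IN_PROGRESS'
```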
7 changes: 4 additions & 3 deletions apps/mappings/tasks.py
@@ -1,8 +1,9 @@
from apps.workspaces.models import Sage300Credential
from apps.sage300.utils import SageDesktopConnector
from apps.mappings.models import ImportLog


def sync_sage300_attributes(sage300_attribute_type: str, workspace_id: int):
def sync_sage300_attributes(sage300_attribute_type: str, workspace_id: int, import_log: ImportLog = None):
sage300_credentials: Sage300Credential = Sage300Credential.objects.get(workspace_id=workspace_id)

sage300_connection = SageDesktopConnector(
@@ -12,8 +13,8 @@ def sync_sage300_attributes(sage300_attribute_type: str, workspace_id: int):

sync_functions = {
'JOB': sage300_connection.sync_jobs,
'COST_CODE': sage300_connection.sync_cost_codes,
'COST_CATEGORY': sage300_connection.sync_cost_categories,
'COST_CODE': lambda:sage300_connection.sync_cost_codes(import_log),
'COST_CATEGORY': lambda:sage300_connection.sync_cost_categories(import_log),
Comment on lines +16 to +17

Tip: Codebase Verification

The functions sync_cost_codes and sync_cost_categories in apps/sage300/utils.py accept the import_log parameter, but neither uses it in its implementation. This confirms the concern: the parameter was added to the function signatures without being wired into the function bodies.

  • sync_cost_codes function: Defined with _import_log = None, but _import_log is not used inside the function.
  • sync_cost_categories function: Defined with import_log = None, but import_log is not used inside the function.

To address this, ensure that the import_log parameter is utilized within these functions to track and log import operations appropriately.

Locations needing attention:

  • apps/sage300/utils.py: sync_cost_codes
  • apps/sage300/utils.py: sync_cost_categories
Analysis chain

Approved: Proper use of import_log in lambda functions.

The addition of import_log to the lambda functions for syncing cost codes and categories is a significant improvement for error tracking and operational logging. Please ensure that the sync_cost_codes and sync_cost_categories functions are implemented to handle this parameter correctly.

Scripts executed

The following scripts were executed for the analysis:


Script:

#!/bin/bash
# Description: Verify the correct usage of `import_log` in the sync functions.

# Test: Search for the function implementations. Expect: Proper handling of `import_log`.
ast-grep --lang python --pattern $'def sync_cost_codes($_, $_, $_) { $$$ }' apps/sage300/utils.py
ast-grep --lang python --pattern $'def sync_cost_categories($_, $_, $_) { $$$ }' apps/sage300/utils.py

Length of output: 1131



Script:

#!/bin/bash
# Description: Verify the correct usage of `import_log` in the sync functions.

# Corrected Test: Search for the function implementations. Expect: Proper handling of `import_log`.
ast-grep --lang python --pattern 'def sync_cost_codes(import_log, $$$):' 
ast-grep --lang python --pattern 'def sync_cost_categories(import_log, $$$):' 

Length of output: 154



Script:

#!/bin/bash
# Description: Verify the correct usage of `import_log` in the sync functions across the entire codebase.

# Search for the function definitions of sync_cost_codes and sync_cost_categories.
rg 'def sync_cost_codes' 
rg 'def sync_cost_categories'

Length of output: 208



Script:

#!/bin/bash
# Description: Extract the full implementations of the sync functions to verify the correct usage of `import_log`.

# Extract the implementation of sync_cost_codes.
ast-grep --lang python --pattern 'def sync_cost_codes(self, _import_log = None): { $$$ }' apps/sage300/utils.py

# Extract the implementation of sync_cost_categories.
ast-grep --lang python --pattern 'def sync_cost_categories(self, import_log = None): { $$$ }' apps/sage300/utils.py

Length of output: 231



Script:

#!/bin/bash
# Description: Extract the full content of apps/sage300/utils.py for manual inspection.

# Extract the full content of the file.
rg '' apps/sage300/utils.py

Length of output: 12752

'ACCOUNT': sage300_connection.sync_accounts,
'VENDOR': sage300_connection.sync_vendors,
'COMMITMENT': sage300_connection.sync_commitments,
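Following up on the verification above, a hedged sketch of how the connector could actually consume the parameter; the connection attribute, the _sync_data helper, and the single-batch accounting are placeholders, not the repository's real implementation:

```python
from apps.mappings.models import ImportLog


class SageDesktopConnector:
    # ... existing __init__ and sync methods ...

    def sync_cost_codes(self, import_log: ImportLog = None):
        """Sync cost codes and record progress on the import log, if one is given."""
        cost_codes = self.connection.cost_codes.get_all()  # placeholder API call

        if import_log:
            import_log.total_batches_count = 1
            import_log.save()

        self._sync_data(cost_codes, 'COST_CODE', self.workspace_id)  # placeholder helper

        if import_log:
            import_log.processed_batches_count = 1
            import_log.status = 'COMPLETE'
            import_log.save()
```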
96 changes: 68 additions & 28 deletions apps/sage300/dependent_fields.py
@@ -11,6 +11,8 @@
from apps.fyle.models import DependentFieldSetting
from apps.sage300.models import CostCategory
from apps.fyle.helpers import connect_to_platform
from apps.mappings.models import ImportLog
from apps.mappings.exceptions import handle_import_exceptions

logger = logging.getLogger(__name__)
logger.level = logging.INFO
@@ -68,15 +70,20 @@ def create_dependent_custom_field_in_fyle(workspace_id: int, fyle_attribute_type
return platform.expense_custom_fields.post(expense_custom_field_payload)


def post_dependent_cost_code(dependent_field_setting: DependentFieldSetting, platform: PlatformConnector, filters: Dict, is_enabled: bool = True) -> List[str]:
@handle_import_exceptions
def post_dependent_cost_code(import_log: ImportLog, dependent_field_setting: DependentFieldSetting, platform: PlatformConnector, filters: Dict, is_enabled: bool = True) -> List[str]:
projects = CostCategory.objects.filter(**filters).values('job_name').annotate(cost_codes=ArrayAgg('cost_code_name', distinct=True))
projects_from_categories = [project['job_name'] for project in projects]
posted_cost_codes = []
total_batches = 0
processed_batches = 0
is_errored = False

existing_projects_in_fyle = ExpenseAttribute.objects.filter(
workspace_id=dependent_field_setting.workspace_id,
attribute_type='PROJECT',
value__in=projects_from_categories
value__in=projects_from_categories,
active=True
).values_list('value', flat=True)

for project in projects:
@@ -97,37 +104,60 @@ def post_dependent_cost_code(dependent_field_setting: DependentFieldSetting, pla
if payload:
sleep(0.2)
try:
total_batches += 1
platform.dependent_fields.bulk_post_dependent_expense_field_values(payload)
posted_cost_codes.extend(cost_code_names)
processed_batches += 1
except Exception as exception:
is_errored = True
logger.error(f'Exception while posting dependent cost code | Error: {exception} | Payload: {payload}')
raise

return posted_cost_codes
import_log.status = 'COMPLETE'
import_log.error_log = []
import_log.total_batches_count = total_batches
import_log.processed_batches_count = processed_batches
import_log.save()

return posted_cost_codes, is_errored

def post_dependent_cost_type(dependent_field_setting: DependentFieldSetting, platform: PlatformConnector, filters: Dict):

@handle_import_exceptions
def post_dependent_cost_type(import_log: ImportLog, dependent_field_setting: DependentFieldSetting, platform: PlatformConnector, filters: Dict, posted_cost_codes: List = []):
cost_categories = CostCategory.objects.filter(is_imported=False, **filters).values('cost_code_name').annotate(cost_categories=ArrayAgg('name', distinct=True))
is_errored = False
total_batches = 0
processed_batches = 0

for category in cost_categories:
payload = [
{
'parent_expense_field_id': dependent_field_setting.cost_code_field_id,
'parent_expense_field_value': category['cost_code_name'],
'expense_field_id': dependent_field_setting.cost_category_field_id,
'expense_field_value': cost_type,
'is_enabled': True
} for cost_type in category['cost_categories']
]

if payload:
sleep(0.2)
try:
platform.dependent_fields.bulk_post_dependent_expense_field_values(payload)
CostCategory.objects.filter(cost_code_name=category['cost_code_name']).update(is_imported=True)
except Exception as exception:
logger.error(f'Exception while posting dependent cost type | Error: {exception} | Payload: {payload}')
raise
if category['cost_code_name'] in posted_cost_codes:
payload = [
{
'parent_expense_field_id': dependent_field_setting.cost_code_field_id,
'parent_expense_field_value': category['cost_code_name'],
'expense_field_id': dependent_field_setting.cost_category_field_id,
'expense_field_value': cost_type,
'is_enabled': True
} for cost_type in category['cost_categories']
]

if payload:
sleep(0.2)
try:
total_batches += 1
platform.dependent_fields.bulk_post_dependent_expense_field_values(payload)
CostCategory.objects.filter(cost_code_name=category['cost_code_name']).update(is_imported=True)
processed_batches += 1
except Exception as exception:
is_errored = True
logger.error(f'Exception while posting dependent cost type | Error: {exception} | Payload: {payload}')

import_log.status = 'COMPLETE'
import_log.error_log = []
import_log.total_batches_count = total_batches
import_log.processed_batches_count = processed_batches
import_log.save()

return is_errored


def post_dependent_expense_field_values(workspace_id: int, dependent_field_setting: DependentFieldSetting, platform: PlatformConnector = None):
@@ -141,12 +171,22 @@ def post_dependent_expense_field_values(workspace_id: int, dependent_field_setti
if dependent_field_setting.last_successful_import_at:
filters['updated_at__gte'] = dependent_field_setting.last_successful_import_at

posted_cost_types = post_dependent_cost_code(dependent_field_setting, platform, filters)
if posted_cost_types:
filters['cost_code_name__in'] = posted_cost_types
post_dependent_cost_type(dependent_field_setting, platform, filters)
cost_code_import_log = ImportLog.objects.filter(workspace_id=workspace_id, attribute_type='COST_CODE').first()
cost_category_import_log = ImportLog.objects.filter(workspace_id=workspace_id, attribute_type='COST_CATEGORY').first()

DependentFieldSetting.objects.filter(workspace_id=workspace_id).update(last_successful_import_at=datetime.now())
posted_cost_codes, is_cost_code_errored = post_dependent_cost_code(cost_code_import_log, dependent_field_setting, platform, filters)
if posted_cost_codes:
filters['cost_code_name__in'] = posted_cost_codes

if cost_code_import_log.status in ['FAILED', 'FATAL']:
cost_category_import_log.status = 'FAILED'
cost_category_import_log.error_log = {'message': 'Importing COST_CODE failed'}
cost_category_import_log.save()
return
else:
is_cost_type_errored = post_dependent_cost_type(cost_category_import_log, dependent_field_setting, platform, filters, posted_cost_codes)
if not is_cost_type_errored and not is_cost_code_errored:
DependentFieldSetting.objects.filter(workspace_id=workspace_id).update(last_successful_import_at=datetime.now())


def import_dependent_fields_to_fyle(workspace_id: str):
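For readers skimming the diff, a distilled sketch of the batch-tracking pattern that both post_dependent_cost_code and post_dependent_cost_type now follow; post_in_batches is a hypothetical helper, and the real functions build their payloads from CostCategory rows rather than taking them as an argument:

```python
import logging
from time import sleep

logger = logging.getLogger(__name__)


def post_in_batches(import_log, platform, batches):
    """Post payload batches to Fyle and record progress on the shared ImportLog."""
    total_batches = 0
    processed_batches = 0
    is_errored = False

    for payload in batches:
        if not payload:
            continue
        sleep(0.2)  # throttle calls to the Fyle API, as in the diff
        try:
            total_batches += 1
            platform.dependent_fields.bulk_post_dependent_expense_field_values(payload)
            processed_batches += 1
        except Exception as exception:
            is_errored = True
            logger.error('Batch failed | Error: %s | Payload: %s', exception, payload)

    # The counts let callers see partial progress even when some batches fail.
    import_log.status = 'COMPLETE'
    import_log.error_log = []
    import_log.total_batches_count = total_batches
    import_log.processed_batches_count = processed_batches
    import_log.save()

    return is_errored
```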