Import Logs for Deps Import #186

Closed · wants to merge 47 commits
Changes from 36 commits

Commits (47):
f6df1ec  Delete purchase_invoice, line_items on failed exports from hh2, type-… (Hrishabh17, May 21, 2024)
372c628  Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api (Hrishabh17, May 21, 2024)
d0ad337  Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api (Hrishabh17, May 24, 2024)
7825b9e  Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api (Hrishabh17, May 27, 2024)
b79c74c  Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api (Hrishabh17, May 27, 2024)
97779c6  Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api (Hrishabh17, May 28, 2024)
601e3cb  Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api (Hrishabh17, May 28, 2024)
fb78432  Refactor deps schedule to run post Job import (Hrishabh17, Jun 3, 2024)
57a4ed4  Fyle Card <> Vendor Mapping setup (Hrishabh17, Jun 3, 2024)
19dfa7d  Added script to add mapping_settings (Hrishabh17, Jun 3, 2024)
151e397  Fix post release script (Hrishabh17, Jun 4, 2024)
ffbe1f5  Projects and Deps fields disable v1 (Hrishabh17, Jun 4, 2024)
372e5ab  Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api (Hrishabh17, Jun 4, 2024)
737c79d  Remove dep setting trigger, add logger (Hrishabh17, Jun 4, 2024)
da072df  Merge branch 'refactor-deps-import' into vendor-card-mapping (Hrishabh17, Jun 4, 2024)
e2f4cef  Modified script, added additional test case (Hrishabh17, Jun 4, 2024)
5bc9075  lint fix (Hrishabh17, Jun 4, 2024)
c6a4697  Remove mock object (Hrishabh17, Jun 4, 2024)
f667558  Merge branch 'refactor-deps-import' into vendor-card-mapping (Hrishabh17, Jun 4, 2024)
1c2e3c1  Add details while logging (Hrishabh17, Jun 4, 2024)
0e98a8a  Merge branch 'refactor-deps-import' into vendor-card-mapping (Hrishabh17, Jun 4, 2024)
d23f0cd  modify post-release script (Hrishabh17, Jun 4, 2024)
7dd6145  Merge branch 'vendor-card-mapping' into disable-sage-fields (Hrishabh17, Jun 4, 2024)
f1a31f1  bump accounting-mapping version (Hrishabh17, Jun 4, 2024)
505fabc  modify the variable_name, add conditional update (Hrishabh17, Jun 5, 2024)
e85f6b2  Add example objects (Hrishabh17, Jun 5, 2024)
b1f39ca  Added loggers (Hrishabh17, Jun 5, 2024)
555a72c  Added test cases (Hrishabh17, Jun 5, 2024)
675c37a  Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api (Hrishabh17, Jun 5, 2024)
84ba2e9  Dependent Field optimizations (Hrishabh17, Jun 6, 2024)
6dd28a9  fix failing test (Hrishabh17, Jun 6, 2024)
aa96f62  Added Import Log for Deps (Hrishabh17, Jun 6, 2024)
51e76e5  Fix post-release script (Hrishabh17, Jun 6, 2024)
6cbcb7d  Merge branch 'dep-field-optimization' into dep-import-log (Hrishabh17, Jun 6, 2024)
6c54fdc  fix failing test cases (Hrishabh17, Jun 6, 2024)
1da8a7d  Set cost category import to fail on cost code fail (Hrishabh17, Jun 7, 2024)
9aefa9f  Modify handle_import_exception for both class and func (Hrishabh17, Jun 7, 2024)
173d31a  Modify test cases (Hrishabh17, Jun 7, 2024)
5dac65e  fix ordering of saving import_log (Hrishabh17, Jun 7, 2024)
ef99211  Move the import_log creation to ImportLog method (Hrishabh17, Jun 7, 2024)
771c939  Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api (Hrishabh17, Jun 10, 2024)
0d688ac  Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api (Hrishabh17, Jun 11, 2024)
53d90b1  Merge branch 'master' of github.com:fylein/fyle-sage-desktop-api (Hrishabh17, Jun 11, 2024)
766beb3  Merged master and bumped accounting-mappings version (Hrishabh17, Jun 11, 2024)
8393600  Add logger in import (Hrishabh17, Jun 12, 2024)
7a6d4fa  Merge branch 'master' into dep-import-log (Hrishabh17, Jun 12, 2024)
30b7c50  Merge branch 'master' into dep-import-log (Hrishabh17, Jun 13, 2024)
2 changes: 1 addition & 1 deletion apps/accounting_exports/models.py
@@ -47,7 +47,7 @@ def _group_expenses(expenses: List[Expense], export_setting: ExportSetting, fund
reimbursable_expense_date = export_setting.reimbursable_expense_date

default_fields = ['employee_email', 'fund_source']
report_grouping_fields = ['report_id', 'claim_number']
report_grouping_fields = ['report_id', 'claim_number', 'corporate_card_id']
expense_grouping_fields = ['expense_id', 'expense_number']

# Define a mapping for fund sources and their associated group fields
3 changes: 2 additions & 1 deletion apps/fyle/serializers.py
@@ -15,7 +15,7 @@
from apps.workspaces.models import Workspace, FyleCredential
from apps.fyle.models import ExpenseFilter, DependentFieldSetting
from apps.fyle.helpers import get_expense_fields

from apps.mappings.imports.queues import chain_import_fields_to_fyle

logger = logging.getLogger(__name__)
logger.level = logging.INFO
@@ -39,6 +39,7 @@ def create(self, validated_data):
platform = PlatformConnector(fyle_credentials)

if refresh:
chain_import_fields_to_fyle(workspace_id=workspace_id)
platform.import_fyle_dimensions()
workspace.source_synced_at = datetime.now()
workspace.save(update_fields=['source_synced_at'])
14 changes: 1 addition & 13 deletions apps/fyle/signals.py
@@ -3,15 +3,13 @@
"""
import logging

from django.db.models.signals import post_save, pre_save
from django.db.models.signals import pre_save
from django.dispatch import receiver

from fyle_integrations_platform_connector import PlatformConnector
from apps.workspaces.models import FyleCredential
from apps.fyle.models import DependentFieldSetting
from apps.sage300.dependent_fields import create_dependent_custom_field_in_fyle
from apps.mappings.imports.schedules import schedule_or_delete_dependent_field_tasks


logger = logging.getLogger(__name__)
logger.level = logging.INFO
@@ -51,13 +49,3 @@ def run_pre_save_dependent_field_settings_triggers(sender, instance: DependentFi
parent_field_id=instance.cost_code_field_id,
)
instance.cost_category_field_id = cost_category['data']['id']


@receiver(post_save, sender=DependentFieldSetting)
def run_post_save_dependent_field_settings_triggers(sender, instance: DependentFieldSetting, **kwargs):
"""
:param sender: Sender Class
:param instance: Row instance of Sender Class
:return: None
"""
schedule_or_delete_dependent_field_tasks(instance.workspace_id)
10 changes: 7 additions & 3 deletions apps/mappings/exceptions.py
@@ -17,8 +17,12 @@


def handle_import_exceptions(func):
def new_fn(expense_attribute_instance, *args):
import_log: ImportLog = args[0]
def new_fn(expense_attribute_instance, *args, **kwargs):
import_log = None
if isinstance(expense_attribute_instance, ImportLog):
import_log: ImportLog = expense_attribute_instance
else:
import_log: ImportLog = args[0]
workspace_id = import_log.workspace_id
attribute_type = import_log.attribute_type
error = {
@@ -28,7 +32,7 @@ def new_fn(expense_attribute_instance, *args):
'response': None
}
try:
return func(expense_attribute_instance, *args)
return func(expense_attribute_instance, *args, **kwargs)
except WrongParamsError as exception:
error['message'] = exception.message
error['response'] = exception.response
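For reference, a minimal self-contained sketch of the dispatch this decorator now performs: the import log may arrive either as the first positional argument (plain functions) or as args[0] (import module classes). The stub class, function names, and the single generic exception branch below are illustrative only, not the repository code.

```python
import functools


class ImportLogStub:
    """In-memory stand-in for the ImportLog model (illustration only)."""

    def __init__(self, workspace_id: int, attribute_type: str):
        self.workspace_id = workspace_id
        self.attribute_type = attribute_type
        self.status = 'IN_PROGRESS'


def handle_import_exceptions_sketch(func):
    @functools.wraps(func)
    def new_fn(first_arg, *args, **kwargs):
        # Plain functions receive the log as their first argument; module
        # classes receive the module instance first and the log as args[0].
        import_log = first_arg if isinstance(first_arg, ImportLogStub) else args[0]
        try:
            return func(first_arg, *args, **kwargs)
        except Exception as exception:
            # The real decorator distinguishes exception types; collapsed here.
            import_log.status = 'FATAL'
            print(f'Import failed for {import_log.attribute_type}: {exception}')
    return new_fn


@handle_import_exceptions_sketch
def post_cost_codes_sketch(import_log: ImportLogStub, payload: list):
    if not payload:
        raise ValueError('empty payload')


log = ImportLogStub(workspace_id=1, attribute_type='COST_CODE')
post_cost_codes_sketch(log, [])
print(log.status)  # FATAL
```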
15 changes: 15 additions & 0 deletions apps/mappings/helpers.py
@@ -0,0 +1,15 @@
from apps.mappings.models import ImportLog


def create_deps_import_log(attribute_type, workspace_id):
"""
Create dependent import logs
"""
import_log, _ = ImportLog.objects.update_or_create(
workspace_id=workspace_id,
attribute_type=attribute_type,
defaults={
'status': 'IN_PROGRESS'
}
)
return import_log
10 changes: 10 additions & 0 deletions apps/mappings/imports/queues.py
@@ -1,6 +1,7 @@
from django_q.tasks import Chain
from fyle_accounting_mappings.models import MappingSetting
from apps.workspaces.models import ImportSetting
from apps.fyle.models import DependentFieldSetting


def chain_import_fields_to_fyle(workspace_id):
@@ -11,6 +12,9 @@ def chain_import_fields_to_fyle(workspace_id):
mapping_settings = MappingSetting.objects.filter(workspace_id=workspace_id, import_to_fyle=True)
custom_field_mapping_settings = MappingSetting.objects.filter(workspace_id=workspace_id, is_custom=True, import_to_fyle=True)
import_settings = ImportSetting.objects.get(workspace_id=workspace_id)
dependent_field_settings = DependentFieldSetting.objects.filter(workspace_id=workspace_id, is_import_enabled=True).first()
project_mapping = MappingSetting.objects.filter(workspace_id=workspace_id, source_field='PROJECT', import_to_fyle=True).first()

chain = Chain()

if import_settings.import_categories:
@@ -47,5 +51,11 @@
True
)

if project_mapping and dependent_field_settings:
chain.append(
'apps.mappings.imports.tasks.auto_import_and_map_fyle_fields',
workspace_id
)

if chain.length() > 0:
chain.run()
35 changes: 0 additions & 35 deletions apps/mappings/imports/schedules.py
@@ -1,41 +1,9 @@
from datetime import datetime
from django_q.models import Schedule
from fyle_accounting_mappings.models import MappingSetting

from apps.fyle.models import DependentFieldSetting
from apps.workspaces.models import ImportSetting


def schedule_or_delete_dependent_field_tasks(workspace_id: int):
"""
:param configuration: Workspace Configuration Instance
:return: None
"""
project_mapping = MappingSetting.objects.filter(
source_field='PROJECT',
workspace_id=workspace_id,
import_to_fyle=True
).first()
dependent_fields = DependentFieldSetting.objects.filter(workspace_id=workspace_id, is_import_enabled=True).first()

if project_mapping and dependent_fields:
start_datetime = datetime.now()
Schedule.objects.update_or_create(
func='apps.mappings.imports.tasks.auto_import_and_map_fyle_fields',
args='{}'.format(workspace_id),
defaults={
'schedule_type': Schedule.MINUTES,
'minutes': 24 * 60,
'next_run': start_datetime
}
)
elif not (project_mapping and dependent_fields):
Schedule.objects.filter(
func='apps.mappings.imports.tasks.auto_import_and_map_fyle_fields',
args='{}'.format(workspace_id)
).delete()


def schedule_or_delete_fyle_import_tasks(import_settings: ImportSetting, mapping_setting_instance: MappingSetting = None):
"""
Schedule or delete Fyle import tasks based on the import settingss.
@@ -78,6 +46,3 @@ def schedule_or_delete_fyle_import_tasks(import_settings: ImportSetting, mapping
func='apps.mappings.imports.queues.chain_import_fields_to_fyle',
args='{}'.format(import_settings.workspace_id)
).delete()

# Schedule or delete dependent field tasks
schedule_or_delete_dependent_field_tasks(import_settings.workspace_id)
30 changes: 18 additions & 12 deletions apps/mappings/imports/tasks.py
@@ -1,14 +1,17 @@
import logging
from django_q.tasks import Chain

from fyle_accounting_mappings.models import MappingSetting

from apps.mappings.models import ImportLog
from apps.mappings.imports.modules.categories import Category
from apps.mappings.imports.modules.projects import Project
from apps.mappings.imports.modules.cost_centers import CostCenter
from apps.mappings.imports.modules.merchants import Merchant
from apps.mappings.imports.modules.expense_custom_fields import ExpenseCustomField
from apps.fyle.models import DependentFieldSetting
from apps.mappings.helpers import create_deps_import_log


logger = logging.getLogger(__name__)
logger.level = logging.INFO

SOURCE_FIELD_CLASS_MAP = {
'CATEGORY': Category,
@@ -41,20 +44,23 @@ def auto_import_and_map_fyle_fields(workspace_id):
"""
Auto import and map fyle fields
"""
project_mapping = MappingSetting.objects.filter(
source_field='PROJECT',
import_log = ImportLog.objects.filter(
workspace_id=workspace_id,
import_to_fyle=True
attribute_type = 'PROJECT'
).first()
dependent_fields = DependentFieldSetting.objects.filter(workspace_id=workspace_id, is_import_enabled=True).first()

chain = Chain()

if project_mapping and dependent_fields:
chain.append('apps.mappings.tasks.sync_sage300_attributes', 'JOB', workspace_id)
chain.append('apps.mappings.tasks.sync_sage300_attributes', 'COST_CODE', workspace_id)
chain.append('apps.mappings.tasks.sync_sage300_attributes', 'COST_CATEGORY', workspace_id)
chain.append('apps.sage300.dependent_fields.import_dependent_fields_to_fyle', workspace_id)
cost_code_import_log = create_deps_import_log('COST_CODE', workspace_id)
cost_category_import_log = create_deps_import_log('COST_CATEGORY', workspace_id)

chain.append('apps.mappings.tasks.sync_sage300_attributes', 'JOB', workspace_id)
chain.append('apps.mappings.tasks.sync_sage300_attributes', 'COST_CODE', workspace_id, cost_code_import_log)
chain.append('apps.mappings.tasks.sync_sage300_attributes', 'COST_CATEGORY', workspace_id, cost_category_import_log)
chain.append('apps.sage300.dependent_fields.import_dependent_fields_to_fyle', workspace_id)

if import_log and import_log.status != 'COMPLETE':
logger.error(f"Project Import is in {import_log.status} state in WORKSPACE_ID: {workspace_id} with error {str(import_log.error_log)}")

if chain.length() > 0:
chain.run()
7 changes: 4 additions & 3 deletions apps/mappings/tasks.py
@@ -1,8 +1,9 @@
from apps.workspaces.models import Sage300Credential
from apps.sage300.utils import SageDesktopConnector
from apps.mappings.models import ImportLog


def sync_sage300_attributes(sage300_attribute_type: str, workspace_id: int):
def sync_sage300_attributes(sage300_attribute_type: str, workspace_id: int, import_log: ImportLog = None):
sage300_credentials: Sage300Credential = Sage300Credential.objects.get(workspace_id=workspace_id)

sage300_connection = SageDesktopConnector(
@@ -12,8 +13,8 @@ def sync_sage300_attributes(sage300_attribute_type: str, workspace_id: int):

sync_functions = {
'JOB': sage300_connection.sync_jobs,
'COST_CODE': sage300_connection.sync_cost_codes,
'COST_CATEGORY': sage300_connection.sync_cost_categories,
'COST_CODE': lambda:sage300_connection.sync_cost_codes(import_log),
'COST_CATEGORY': lambda:sage300_connection.sync_cost_categories(import_log),
'ACCOUNT': sage300_connection.sync_accounts,
'VENDOR': sage300_connection.sync_vendors,
'COMMITMENT': sage300_connection.sync_commitments,
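The lambda-based dispatch used above can be sketched independently of the Sage connector: only the cost code and cost category syncs receive the import log, while the other entries keep their zero-argument form. Everything below (the stub connector, function names, and sample log value) is an illustrative assumption, not the repository code.

```python
from typing import Callable, Dict, Optional


class ConnectorStub:
    """Stand-in for SageDesktopConnector (illustration only)."""

    def sync_jobs(self) -> None:
        print('syncing jobs')

    def sync_cost_codes(self, import_log: Optional[object] = None) -> None:
        print(f'syncing cost codes with log: {import_log}')


def sync_attributes(attribute_type: str, import_log: Optional[object] = None) -> None:
    connector = ConnectorStub()
    # Lambdas defer the call so only the log-aware syncs receive the import
    # log; the other entries keep their zero-argument signature.
    sync_functions: Dict[str, Callable[[], None]] = {
        'JOB': connector.sync_jobs,
        'COST_CODE': lambda: connector.sync_cost_codes(import_log),
    }
    sync_functions[attribute_type]()


sync_attributes('JOB')
sync_attributes('COST_CODE', import_log='cost-code-import-log')
```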
54 changes: 43 additions & 11 deletions apps/sage300/dependent_fields.py
@@ -11,6 +11,8 @@
from apps.fyle.models import DependentFieldSetting
from apps.sage300.models import CostCategory
from apps.fyle.helpers import connect_to_platform
from apps.mappings.models import ImportLog
from apps.mappings.exceptions import handle_import_exceptions

logger = logging.getLogger(__name__)
logger.level = logging.INFO
@@ -68,8 +70,8 @@ def create_dependent_custom_field_in_fyle(workspace_id: int, fyle_attribute_type
return platform.expense_custom_fields.post(expense_custom_field_payload)


def post_dependent_cost_code(dependent_field_setting: DependentFieldSetting, platform: PlatformConnector, filters: Dict) -> List[str]:

@handle_import_exceptions
def post_dependent_cost_code(import_log: ImportLog, dependent_field_setting: DependentFieldSetting, platform: PlatformConnector, filters: Dict, is_enabled: bool = True) -> List[str]:
projects = CostCategory.objects.filter(**filters).values('job_name').annotate(cost_codes=ArrayAgg('cost_code_name', distinct=True))
projects_from_categories = [project['job_name'] for project in projects]
posted_cost_codes = []
@@ -91,20 +93,28 @@ def post_dependent_cost_code(dependent_field_setting: DependentFieldSetting, pla
'parent_expense_field_value': project['job_name'],
'expense_field_id': dependent_field_setting.cost_code_field_id,
'expense_field_value': cost_code,
'is_enabled': True
'is_enabled': is_enabled
})
cost_code_names.append(cost_code)

if payload:
sleep(0.2)
platform.dependent_fields.bulk_post_dependent_expense_field_values(payload)
posted_cost_codes.extend(cost_code_names)

try:
platform.dependent_fields.bulk_post_dependent_expense_field_values(payload)
posted_cost_codes.extend(cost_code_names)
except Exception as exception:
logger.error(f'Exception while posting dependent cost code | Error: {exception} | Payload: {payload}')
raise

import_log.status = 'COMPLETE'
Review thread:
Contributor: can we also store batches count while posting?
Contributor Author: store batch count in import log?
Contributor: yes

import_log.error_log = []
import_log.save()
return posted_cost_codes
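One possible shape for the batch counting raised in the review thread above, as a hedged sketch: the total_batches field and the in-memory stand-in below are assumptions for illustration and are not part of this PR.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ImportLogStub:
    """In-memory stand-in for ImportLog; total_batches is a hypothetical field."""
    status: str = 'IN_PROGRESS'
    total_batches: int = 0
    error_log: list = field(default_factory=list)


def post_in_batches(values: List[str], import_log: ImportLogStub, batch_size: int = 200) -> None:
    """Post values in fixed-size batches, counting each batch as it is sent."""
    for start in range(0, len(values), batch_size):
        batch = values[start:start + batch_size]
        # In the PR this is where bulk_post_dependent_expense_field_values
        # would be called with the batch payload.
        print(f'posting batch of {len(batch)} values')
        import_log.total_batches += 1
    import_log.status = 'COMPLETE'


log = ImportLogStub()
post_in_batches([f'cost_code_{i}' for i in range(450)], log)
print(log.total_batches)  # 3
```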


def post_dependent_cost_type(dependent_field_setting: DependentFieldSetting, platform: PlatformConnector, filters: Dict):
cost_categories = CostCategory.objects.filter(**filters).values('cost_code_name').annotate(cost_categories=ArrayAgg('name', distinct=True))
@handle_import_exceptions
def post_dependent_cost_type(import_log: ImportLog, dependent_field_setting: DependentFieldSetting, platform: PlatformConnector, filters: Dict):
cost_categories = CostCategory.objects.filter(is_imported=False, **filters).values('cost_code_name').annotate(cost_categories=ArrayAgg('name', distinct=True))

for category in cost_categories:
payload = [
@@ -119,7 +129,16 @@ def post_dependent_cost_type(dependent_field_setting: DependentFieldSetting, pla

if payload:
sleep(0.2)
platform.dependent_fields.bulk_post_dependent_expense_field_values(payload)
try:
platform.dependent_fields.bulk_post_dependent_expense_field_values(payload)
CostCategory.objects.filter(cost_code_name=category['cost_code_name']).update(is_imported=True)
except Exception as exception:
logger.error(f'Exception while posting dependent cost type | Error: {exception} | Payload: {payload}')
raise

import_log.status = 'COMPLETE'
import_log.error_log = []
import_log.save()


def post_dependent_expense_field_values(workspace_id: int, dependent_field_setting: DependentFieldSetting, platform: PlatformConnector = None):
Expand All @@ -133,10 +152,23 @@ def post_dependent_expense_field_values(workspace_id: int, dependent_field_setti
if dependent_field_setting.last_successful_import_at:
filters['updated_at__gte'] = dependent_field_setting.last_successful_import_at

posted_cost_types = post_dependent_cost_code(dependent_field_setting, platform, filters)
cost_code_import_log = ImportLog.objects.filter(workspace_id=workspace_id, attribute_type='COST_CODE').first()
cost_category_import_log = ImportLog.objects.filter(workspace_id=workspace_id, attribute_type='COST_CATEGORY').first()

posted_cost_types = post_dependent_cost_code(cost_code_import_log, dependent_field_setting, platform, filters)
if posted_cost_types:
filters['cost_code_name__in'] = posted_cost_types
post_dependent_cost_type(dependent_field_setting, platform, filters)
post_dependent_cost_type(cost_category_import_log, dependent_field_setting, platform, filters)

if cost_code_import_log.status in ['FAILED', 'FATAL']:
Review thread:
Contributor: wouldn't there be an exception in case of errors?
Contributor Author: Both post_dependent_cost_code and post_dependent_cost_type are decorated with handle_import_exceptions.
1. If the cost code post fails, the exception handler in the decorator sets the import log status to FAILED.
2. When the call returns we check the status; if it is COMPLETE, we call the cost type post (and if an error occurs there, that status becomes FAILED/FATAL). Otherwise we mark it complete.
Contributor Author: Also fixed the call ordering; it was wrong earlier. Please check the code again for this.

cost_category_import_log.status = 'FAILED'
cost_category_import_log.error_log = {'message': 'Importing COST_CODE failed'}
Review thread:
Contributor: we're storing all details in the exception, why are we overriding?
(Screenshot attached: 2024-06-13, 10:59 AM)

cost_category_import_log.save()
return
else:
cost_category_import_log.status = 'COMPLETE'
cost_category_import_log.error_log = []
cost_category_import_log.save()

DependentFieldSetting.objects.filter(workspace_id=workspace_id).update(last_successful_import_at=datetime.now())
Review thread:
Contributor: we should bring this inside the else block, otherwise we might mark the timestamp even for failed runs.


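A sketch of the restructuring the reviewer suggests: update last_successful_import_at only on the success path, so a FAILED/FATAL cost code import does not advance the timestamp. This fragment reuses the names from the diff above and reflects the suggestion, not the PR as submitted.

```python
if cost_code_import_log.status in ['FAILED', 'FATAL']:
    cost_category_import_log.status = 'FAILED'
    cost_category_import_log.error_log = {'message': 'Importing COST_CODE failed'}
    cost_category_import_log.save()
    return
else:
    cost_category_import_log.status = 'COMPLETE'
    cost_category_import_log.error_log = []
    cost_category_import_log.save()

    # Moved inside the success branch so failed runs keep the previous
    # last_successful_import_at and are retried from the same point.
    DependentFieldSetting.objects.filter(workspace_id=workspace_id).update(
        last_successful_import_at=datetime.now()
    )
```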