
{"_buckets": {"deposit": "844bcaaa-e06e-4fdd-aea5-c0d4023876df"}, "_deposit": {"id": "4908", "owners": [], "pid": {"revision_id": 0, "type": "recid", "value": "4908"}, "status": "published"}, "_oai": {"id": "oai:meral.edu.mm:recid/4908", "sets": ["1597824273898", "user-ucsy"]}, "communities": ["ucsy"], "item_1583103067471": {"attribute_name": "Title", "attribute_value_mlt": [{"subitem_1551255647225": "Duplicate Records Elimination in Bibliographical Dataset using Priority Queue Algorithm with Smith-Waterman Algorithm", "subitem_1551255648112": "en"}]}, "item_1583103085720": {"attribute_name": "Description", "attribute_value_mlt": [{"interim": "Often, in the real world, entities have two or more representations in databases. Duplicate records do not share a common key and / or they contain errors that make duplicate matching a difficult task. A major problem that arises from integrating different databases is the existence of duplicates. Data cleaning is the process for identifying two or more records within the database, which represent the same real world object (duplicates), so that a unique representation for each object is adopted. This system addresses the data cleaning problem of detecting duplicate records that are approximate duplicates, but not exact duplicates. It uses Priority Queue algorithm with Smith Waterman algorithm for computing minimum edit-distance similarity values to recognize pairs of approximately duplicates and then eliminate the detected duplicate records. And, we also determine the performance evaluation with the lowest FP %( false positive percentage) and FN %( false negative percentage) as the best result."}]}, "item_1583103108160": {"attribute_name": "Keywords", "attribute_value": []}, "item_1583103120197": {"attribute_name": "Files", "attribute_type": "file", "attribute_value_mlt": [{"accessrole": "open_access", "date": [{"dateType": "Available", "dateValue": "2019-07-12"}], "displaytype": "preview", "download_preview_message": "", "file_order": 0, "filename": "psc2010paper (3).pdf", "filesize": [{"value": "545 Kb"}], "format": "application/pdf", "future_date_message": "", "is_thumbnail": false, "licensetype": "license_free", "mimetype": "application/pdf", "size": 545000.0, "url": {"url": "https://meral.edu.mm/record/4908/files/psc2010paper (3).pdf"}, "version_id": "20aab5b5-f0e3-4b97-9c73-29fbf47d8753"}]}, "item_1583103131163": {"attribute_name": "Journal articles", "attribute_value_mlt": [{"subitem_issue": "", "subitem_journal_title": "Fifth Local Conference on Parallel and Soft Computing", "subitem_pages": "", "subitem_volume": ""}]}, "item_1583103147082": {"attribute_name": "Conference papers", "attribute_value_mlt": [{"subitem_acronym": "", "subitem_c_date": "", "subitem_conference_title": "", "subitem_part": "", "subitem_place": "", "subitem_session": "", "subitem_website": ""}]}, "item_1583103211336": {"attribute_name": "Books/reports/chapters", "attribute_value_mlt": [{"subitem_book_title": "", "subitem_isbn": "", "subitem_pages": "", "subitem_place": "", "subitem_publisher": ""}]}, "item_1583103233624": {"attribute_name": "Thesis/dissertations", "attribute_value_mlt": [{"subitem_awarding_university": "", "subitem_supervisor(s)": [{"subitem_supervisor": ""}]}]}, "item_1583105942107": {"attribute_name": "Authors", "attribute_value_mlt": [{"subitem_authors": [{"subitem_authors_fullname": "Thaung, Su Mon"}, {"subitem_authors_fullname": "Htike, Thin Thin"}]}]}, "item_1583108359239": {"attribute_name": "Upload type", "attribute_value_mlt": [{"interim": 
"Publication"}]}, "item_1583108428133": {"attribute_name": "Publication type", "attribute_value_mlt": [{"interim": "Article"}]}, "item_1583159729339": {"attribute_name": "Publication date", "attribute_value": "2010-12-16"}, "item_1583159847033": {"attribute_name": "Identifier", "attribute_value": "http://onlineresource.ucsy.edu.mm/handle/123456789/811"}, "item_title": "Duplicate Records Elimination in Bibliographical Dataset using Priority Queue Algorithm with Smith-Waterman Algorithm", "item_type_id": "21", "owner": "1", "path": ["1597824273898"], "permalink_uri": "http://hdl.handle.net/20.500.12678/0000004908", "pubdate": {"attribute_name": "Deposited date", "attribute_value": "2019-07-12"}, "publish_date": "2019-07-12", "publish_status": "0", "recid": "4908", "relation": {}, "relation_version_is_last": true, "title": ["Duplicate Records Elimination in Bibliographical Dataset using Priority Queue Algorithm with Smith-Waterman Algorithm"], "weko_shared_id": -1}


Files

psc2010paper (3).pdf (545 KB, application/pdf, open access)
https://meral.edu.mm/record/4908/files/psc2010paper (3).pdf
