Meta Learning Paper Supplemental Code
Meta learning with LLMs: supplemental code for reproducing the computational results for MLT and MLT-plus-TM. Related research paper: "META LEARNING WITH LANGUAGE MODELS: CHALLENGES AND OPPORTUNITIES IN THE CLASSIFICATION OF IMBALANCED TEXT", A. Vassilev, H. Jin, M. Hasan, 2023 (to appear on arXiv). All code and data are contained in the zip archive arxiv2023.zip, subject to the licensing terms shown below. See the Readme.txt contained there for a detailed explanation of how to unpack and run the code. See also requirements.txt for the necessary dependencies (libraries needed). This is not a dataset, but only Python source code.
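A minimal sketch of how the archive might be fetched and set up, based on the download URL and file names listed in the metadata below. The exact location of requirements.txt inside the archive and the virtual-environment step are assumptions; Readme.txt in the archive is the authoritative guide.

```shell
# Download the supplemental code archive from the NIST data portal
curl -LO https://data.nist.gov/od/ds/mds2-3074/arxiv2023.zip

# Unpack into a working directory; see Readme.txt inside for run instructions
unzip arxiv2023.zip -d arxiv2023
cd arxiv2023

# Install the dependencies listed in requirements.txt
# (an isolated virtual environment is recommended, but optional)
python -m pip install -r requirements.txt
```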
Complete Metadata
| @type | dcat:Dataset |
|---|---|
| accessLevel | public |
| accrualPeriodicity | irregular |
| bureauCode |
[
"006:55"
]
|
| contactPoint |
{
"fn": "Apostol Vassilev",
"hasEmail": "mailto:apostol.vassilev@nist.gov"
}
|
| description | Meta learning with LLMs: supplemental code for reproducing the computational results for MLT and MLT-plus-TM. Related research paper: "META LEARNING WITH LANGUAGE MODELS: CHALLENGES AND OPPORTUNITIES IN THE CLASSIFICATION OF IMBALANCED TEXT", A. Vassilev, H. Jin, M. Hasan, 2023 (to appear on arXiv). All code and data are contained in the zip archive arxiv2023.zip, subject to the licensing terms shown below. See the Readme.txt contained there for a detailed explanation of how to unpack and run the code. See also requirements.txt for the necessary dependencies (libraries needed). This is not a dataset, but only Python source code. |
| distribution |
[
{
"title": "Meta Learning Paper supplemental code",
"format": "Zip archive",
"accessURL": "https://github.com/usnistgov/NIST-AI-Meta-Learning-LLM",
"description": "Meta learning with LLM: supplemental code for reproducibility of computational results for MLT and MLT-plus-TM. Related research paper: "META LEARNING WITH LANGUAGE MODELS: CHALLENGES AND OPPORTUNITIES IN THE CLASSIFICATION OF IMBALANCED TEXT", A. Vassilev, H. Jin, M. Hasan, 2023 (to appear on arXiv).All code and data is contained in the zip archive arxiv2023.zip, subject to the licensing terms shown below. See the Readme.txt contained there for detailed explanation how to unpack and run the code. See also requirements.txt for the necessary depedencies (libraries needed)."
},
{
"title": "Meta Learning Paper supplemental code",
"format": "zip archive of text files (pyhhon source code)",
"mediaType": "application/zip",
"description": "Meta learning with LLM: supplemental code for reproducibility of computational results for MLT and MLT-plus-TM. Related research paper: "META LEARNING WITH LANGUAGE MODELS: CHALLENGES AND OPPORTUNITIES IN THE CLASSIFICATION OF IMBALANCED TEXT", A. Vassilev, H. Jin, M. Hasan, 2023 (to appear on arXiv).All code and data is contained in the zip archive arxiv2023.zip, subject to the licensing terms shown below. See the Readme.txt contained there for detailed explanation how to unpack and run the code. See also requirements.txt for the necessary depedencies (libraries needed).",
"downloadURL": "https://data.nist.gov/od/ds/mds2-3074/arxiv2023.zip"
}
]
|
| identifier | ark:/88434/mds2-3074 |
| issued | 2023-10-13 |
| keyword |
[
"Deep learning",
"Language Models",
"Meta learning",
"Natural language processing",
"Out of policy speech detection"
]
|
| landingPage | https://data.nist.gov/od/id/mds2-3074 |
| language |
[
"en"
]
|
| license | https://www.nist.gov/open/license |
| modified | 2023-09-11 00:00:00 |
| programCode |
[
"006:045"
]
|
| publisher |
{
"name": "National Institute of Standards and Technology",
"@type": "org:Organization"
}
|
| rights | N/A |
| theme |
[
"Information Technology:Computational science",
"Mathematics and Statistics"
]
|
| title | Meta Learning Paper Supplemental Code |