The final session of the series. It was very enlightening! #шугнанский
https://youtu.be/Ewng-4_lsGs?si=-ncl2Dr4D3KjvIHu
O. I. Belyaev: Coordinating and subordinating constructions in Shughni and Bartangi [Session 8]
Session 8 (April 4, 2024)
Instructor: Oleg Igorevich Belyaev
Since February 2024, the Pamir expedition team (HSE University / Institute of Linguistics RAS / University of Cambridge) has been running a series of introductory sessions on Shughni and closely related languages.
Organizational…
Forwarded from Арт с котом
Lorenzo Lotto borrowed the composition of his Annunciation from Titian and Dieric Bouts, but cranked the emotion up to the maximum and spiced the scene with a terrified cat: the Virgin Mary has no choice but to meekly accept the news the angel brings, and her fate, while the cat, it seems, wants no part of any of it.
1. Lorenzo Lotto. The Annunciation. c. 1534, Villa Colloredo Mels, Recanati
2. Dieric Bouts. The Annunciation. 1455. Getty Museum, Los Angeles
5. Titian. The Annunciation. 1520. Cathedral of St. Peter, Treviso
Forwarded from wikipedia images
The largest mantra inscription in the world is located on Dogee Mountain in Kyzyl, Russia.
https://en.wikipedia.org/wiki/Om_mani_padme_hum
Forwarded from Antibarbari HSE (Kirill Prokopov)
Useful resources for those studying Latin with the Lingua Latina per se Illustrata book series: Latin-Latin glossaries for Familia Romana (by Iosephus Clomparens) and Roma Aeterna (by Juan Pablo Fernández del Río).
Forwarded from ⚡️Linguistic alerts/лингвистические оповещения
#talk
15th April
Automatic interlinear glossing and its use in language documentation efforts
Time: 9am PDT / 12pm EDT / 4pm UTC
Zoom link:
https://cmu.zoom.us/j/98890626672?pwd=aERLMG9uQ0lhQlNMQ1RZUFBvKzlYdz09
Meeting ID: 988 9062 6672 Passcode: 632741
Bio: Michael Ginn is a Ph.D. student in the Linguistics department and Institute of Cognitive Science at the University of Colorado. His research focuses on computational morphology, low-resource methods for NLP, and applying NLP to aid in language documentation projects.
Abstract: Language documentation projects often include the creation of annotated corpora in a format such as interlinear glossed text (IGT), which can be beneficial to language preservation and revitalization, linguistic analyses, and developing language technologies. However, creating extensive IGT corpora is time-consuming and painstaking, and prior research has suggested that automated methods for generating IGT can greatly reduce annotator effort (Palmer et al., 2009; Palmer et al., 2010). The power of large language models has enabled a new wave of development in IGT generation systems. We report findings from the 2023 SIGMORPHON shared task on interlinear glossing (Ginn et al., 2023), where a number of neural systems were used for automated glossing across a variety of languages. In Ginn and Palmer (2023), we analyze the generalization ability of neural IGT systems to out-of-domain texts, and suggest strategies to design more robust systems. Finally, we discuss forthcoming work in large-scale transfer learning for IGT glossing, in which we compile a massive multilingual corpus of IGT and pretrain foundation models for use in future documentation projects.
Thanks to Lena K.!
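For readers unfamiliar with the format the abstract refers to, here is a minimal sketch (not from the talk, and not the speaker's code) of interlinear glossed text as a data structure, with a toy dictionary-lookup glosser standing in for the neural systems the talk actually discusses. The IGTEntry and gloss_line names, the LEXICON table, and the Latin example sentence are all illustrative assumptions.

# Hypothetical sketch of an IGT record: three aligned tiers (transcription,
# word-by-word glosses, free translation). A real glosser predicts the middle
# tier with a trained model rather than a lookup table.
from dataclasses import dataclass

@dataclass
class IGTEntry:
    transcription: list[str]  # surface words of the source-language line
    glosses: list[str]        # one morpheme-by-morpheme gloss per word
    translation: str          # free translation of the whole line

# Toy lexicon; handling words outside it is the hard part automated
# glossing systems must solve.
LEXICON = {
    "puellam": "girl-ACC.SG",
    "videt": "see-PRS.3SG",
}

def gloss_line(words: list[str]) -> list[str]:
    """Gloss each word, marking out-of-lexicon items with '???'."""
    return [LEXICON.get(w, "???") for w in words]

words = ["puellam", "videt"]
entry = IGTEntry(words, gloss_line(words), "He/she sees the girl.")
print(entry)

The three tiers mirror the standard IGT layout described in the abstract; producing the middle (gloss) tier automatically is the task the SIGMORPHON shared-task systems are evaluated on.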