Transformer-based machine translation models are increasingly deployed in multilingual applications and have demonstrated impressive translation capabilities across both low- and high-resource languages. However, most of these systems prioritise surface-level literal translation, often neglecting the intended purpose, or skopos, of the communication. Through the lens of Translation Studies, I will briefly discuss the gaps between human and machine translation and the opportunities large language models offer to address them. I will then present my CoNLL 2024 work, ``Translating Across Cultures: LLMs for Intralingual Cultural Adaptation''. `Cultural adaptation' involves modifying source-culture references to suit the target culture. We curated a corpus of dialogues, annotated culture-specific elements across various categories and levels of foreignness, and defined the goals and aspects of cultural adaptation, evaluating it with both edit-level analysis and broader, more contextual dialogue-level analysis. We assess the performance of several open-source LLMs on cultural adaptation. This application also serves as a way to probe the cross-cultural knowledge of LLMs and their ability to connect related concepts across different cultures. Finally, I will discuss possible issues with automatic adaptation and future opportunities in automatic translation.