ONLY CVs IN ENGLISH WILL BE CONSIDERED IN THE RECRUITMENT PROCESS
Deutsche Telekom Pan-Net designs, operates and steers the joint pan-European network ("Pan-Net") of Deutsche Telekom. The company is a member of the Deutsche Telekom Group, with headquarters in Bratislava and locations across Europe. Deutsche Telekom Pan-Net is building the capabilities for Deutsche Telekom's future centralized production platform across Europe.
Service Delivery consists of several service centers which develop and operate voice, data, messaging and ICT services for Deutsche Telekom's European markets. Services are produced from modular and reusable building blocks with standardized interfaces towards the national companies of Deutsche Telekom. Pan-Net aims at a lean production model with efficient processes and a close integration of development and operations.
Pan-Net is looking for candidates with a start-up mentality who are equally able to support the initial setup with the necessary flexibility and to define a sustainable target organization based on the key production principles of Pan-Net.
Tasks / What you will work on:
- Provide data modeling, data mining, pattern analysis, data visualization and machine learning solutions to address customer needs
- Work in the field of Artificial Intelligence and NLP
- Apply your background in deep learning and machine learning algorithms and model development
- Define and build deep neural network architectures
- Train and validate deep neural networks
- Develop the artificial intelligence platform
- Plan technical projects
- Provide support and consulting in the area of artificial intelligence and deep learning
- Perform project estimations
- Define the implementation of solutions for the problems at hand
- Define verification and validation strategies
- Communicate project output in terms of customer value, business objectives and product opportunity
- Attend technology conferences to stay current on industry trends, challenges and tools
- Promote data science methods and processes across functions
- Develop code in partnership with a product team to make your data science solutions a reality for customers
Professional skills:
The experience listed below may have been obtained through a combination of school work/research, relevant previous jobs and/or internships.
- Degree in Computer Science, Statistics or Mathematics, or equivalent experience
- Familiar with data science techniques: time series, clustering, segmentation, k-means, hierarchical clustering, tree-based methods, neural networks, etc.
- Familiar with natural language processing techniques: symbolic methods, word embeddings, topic modelling etc.
- Excellent understanding of machine learning techniques and algorithms, such as k-NN, Naive Bayes, SVM, Decision Forests, etc.
- Experience with common data science toolkits, such as R, Python, Weka, NumPy, MATLAB, etc.; excellence in at least one of these is highly desirable
- Experience with data visualization tools, such as ggplot2, is a plus
- Good applied statistics skills, such as distributions, statistical testing, regression, etc.
- Good knowledge of statistical analysis, probability theory and design of experiments; hands-on machine learning experience with NLP and with mining of structured, semi-structured and unstructured data
- Intuitive understanding of machine learning algorithms, supervised and unsupervised modeling techniques
- Data-oriented personality
Social skills:
The ideal candidate should exhibit the following behavioral traits:
- Problem-solving skills
- Ability to multitask
- Good communication skills with the ability to develop strong client relationships
- Ability to work in a dynamic and team-oriented environment
Specific skills (methods, trends, languages, knowledge):
Service Delivery – Data Scientist
Experience in the following areas:
- Knowledge of databases and query languages (SQL, PL/SQL), MySQL or Hive/Pig
- Experience with NoSQL databases, such as MongoDB, Cassandra, HBase
- Good command of a programming language and software environment for statistical analysis, graphics representation and reporting, e.g. R or Python
- At least basic knowledge of visualization tools like Tableau, Superset, Business Objects, etc.
- Experience with code versioning systems
- Experience with implementing solutions and platform changes/updates
- Experience in automation and testing via scripting and programming; knowledge of Docker, Ansible, Jenkins or GitLab is a plus
- Ability to understand requirements and derive unit test specifications accordingly
- Ability to analyze and understand existing code; experience with code reviews
- Strong understanding of agile methodologies and continuous integration
- Motivation and flexibility: working well in a fast-paced, collaborative team environment
- Passion for innovation, efficiency and quality
- Experience working with remote and global teams
By applying for this job you accept the DT privacy statement:
To process your online application we collect, process and use your personal data. We will treat your data as strictly confidential in accordance with statutory provisions.
By submitting your application, you consent to your data being processed electronically, including by third parties. Data is only passed on to HR service providers that have been carefully selected by Deutsche Telekom AG.
For detailed information, read the local data protection statement when applying for a job position at Deutsche Telekom Group.