
New BVA initiative to make vet practices a better place to work – VetSurgeon News

Posted: September 21, 2020 at 11:51 pm


The Good Veterinary Workplaces Voluntary Code sets out various criteria for what makes a good workplace, based on a new evidence-based BVA policy position.

The code is accompanied by a workbook which veterinary teams can work through together to consider how they might meet a range of criteria. They can also download, sign and display a Voluntary Code poster signalling their commitment to working towards being a good veterinary workplace.

Teams will be asked to assess what they already do well in areas including health and wellbeing, diversity and equality, workload and flexibility, and providing opportunities for personal and career development, as well as identifying areas for improvement and any HR and management processes that need to be put in place to achieve a positive workplace culture.

At the same time, the BVA has also launched its Good Veterinary Workplaces policy position, a paper which offers 64 recommendations for employers and staff on how to offer a fair and rewarding work environment where everyone feels valued.

The BVA says it decided to develop the Good Veterinary Workplaces policy off the back of an extensive body of work looking at workforce issues in the profession, including recruitment and retention challenges, a lack of diversity across the workforce, and general high levels of stress and burn-out in veterinary teams.

The joint BVA/RCVS-led Vet Futures project identified the need to explore the work-related challenges facing vets and take action to create a sustainable and thriving workforce that can maximise its potential.

Gudrun Ravetz, Chair of the Good Workplace Working Group, said: "I'm absolutely delighted to see the launch of our valuable and comprehensive policy, which sets out a vision of the good veterinary workplaces that we should all be striving to create across the profession. This vision has been shaped by valuable contributions from across the veterinary community, and it's also been really useful to draw on good practice in the wider world of work.

"Each and every one of us deserves to work in a setting where we feel valued, supported and fairly rewarded for the contribution we make, but sadly this isn't the reality for all veterinary professionals. By setting out the steps that all veterinary workplaces can take to offer a more welcoming and inclusive environment, with measures in place to help them address issues and continue to improve, we hope to see more workplaces where staff can thrive and enjoy a fulfilling career."

Daniella Dos Santos, BVA Senior Vice President, said: "It's time for us all to take action to create a culture shift in veterinary workplaces. That means taking positive steps so that diversity and inclusion are championed at all levels, all team members have access to personal and professional development opportunities, and there is recognition that prioritising staff wellbeing is good for businesses.

"In creating the Voluntary Code and workbook, we've purposefully made this something that isn't driven from the top down but is instead something that everyone in the team can feel empowered to feed into and sign up to. This is a golden opportunity for our profession to take ownership of our workplaces, improve conditions, and make sure that we have positive working environments in which we can all take pride."

Whilst you're here, take a moment to see our latest job opportunities for vets.

Read more:
New BVA initiative to make vet practices a better place to work - VetSurgeon News

Written by admin |

September 21st, 2020 at 11:51 pm

Processing grief through dance – Portland Press Herald – Press Herald

Posted: at 11:51 pm


Grief is all around us these days. With nearly 200,000 Americans dead from a perniciously contagious virus, the number of friends and loved ones in mourning is incalculable. For those of us still walking, grieving is even harder when the health crisis suspends our daily lives and disrupts the normal processing of loss and sorrow.

Now imagine you're in prison.

Add to that the daily pain of incarceration, where fear and guilt are the unwelcome daily companions to the base-level disquiet all Americans are living through. Even more so than the rest of us, the incarcerated have little chance for escape. (Figuratively speaking, of course.) But, for a Maine dancer and the moviemaker who filmed her grief-themed dance performance, escape has come from a most unlikely place.

"We don't deal well with grief in our culture and society, and it's even worse in a prison system where there's more grief and even less access to resources," said Portland's Lindsey Bourassa, whose multimedia dance performance El Lobo y La Paloma (The Wolf and the Dove) was filmed in 2017-18 by producer/directors David Camlin and Scott Sutherland and recently picked up by Edovo, a company dedicated to bringing educational, arts, and communication projects to people in the nation's prisons and jails. It's also available for rent or purchase through Vimeo for those on the outside.

Inspired by a poem written by Bourassa's father about the long-ago death of her mother, El Lobo y La Paloma sees Bourassa and fellow flamenco dancer Megan Keogh physically manifesting the grieving process over projected paintings by Khosro Berahmandi and a striking performance by singer Talal Alzefiri and musician Thomas Kovacevic. Incorporating narration of her father's poem and her own on the subject of death and loss, the film is simultaneously passionate, theatrical, and, for those of us not versed in the art of dance, mysterious. Even opaque.

It's resonated with incarcerated viewers to an extent that even Bourassa and Camlin have found surprising. "As an artist, you're always looking for ways to bring your work to different populations, to make it accessible there," says Bourassa. "Flamenco is a really great art form for telling stories, because, while it's very technical, its goal is emotional. Sometimes I feel it touches people in a way that maybe other forms that are less outwardly emotional can't."

While El Lobo y La Paloma wasn't conceived with incarcerated audiences in mind, both Camlin and Bourassa tout Edovo's approach to bringing uplifting, thought-provoking art behind prison walls, where such experiences are perpetually scarce. "Edovo's whole concept is an education-based learning experience," explains Camlin, who splits his professional life between Maine and Michigan. "From muffin recipes to GED tracks to skill training, it's all about personal development. Our goal was to develop a learning component to go along with the film."

Camlin notes that he and Bourassa went through a learning process, as well. "We initially thought about bringing this to a group setting, like we have at public screenings of the film, where I've witnessed some of the best conversations I've ever seen. But for a lot of incarcerated people, there isn't a person they can trust enough to show their emotions. So we tailored it to an individual experience."

Introduced to Edovo by the PMA's Jon Courtney (whose work promoting prison education and arts we've profiled at Indie Film), Bourassa and Camlin, alongside Hospice of Southern Maine counselor Carol Schonberg, have created an immersive experience about processing grief that has clearly struck a chord. They started to receive positive feedback from incarcerated viewers within days of the film's Edovo release, and the pair say the anonymous responses have been sometimes overwhelming. (One unedited response reads, "I didn't take my losses well and this shows me a way to turn pain into art," while another states plainly, "Makes me wanna fix my ways so im remebered by something impressive.")

For a work so personal, the success of El Lobo y La Paloma in touching people in need has been humbling for Bourassa. Noting that her own process has been one of realizing that joy and creativity are as much a part of grieving as sadness, Bourassa says, "The feedback is so helpful in asking, what does this work do? How is this helpful? That's what struck me: how they want to leave a legacy, want to be remembered for something really wonderful."

El Lobo y La Paloma can be seen on-demand at Vimeo. For more of Lindsey Bourassa's dance and David Camlin's filmmaking work, check out their respective websites, bourassadance.com and 7cylinders.com.


See the original post:
Processing grief through dance - Portland Press Herald - Press Herald

Written by admin |

September 21st, 2020 at 11:51 pm

Research Associate / Group Leader in the field of Materials Informatics data-driven approaches for job with TECHNISCHE UNIVERSITAT DRESDEN (TU…

Posted: at 11:51 pm


At TU Dresden, School of Engineering Sciences, the newly established Lab "Dresden Centre for Intelligent Materials" (DCIM) offers, in the field of Materials Informatics (data-driven approaches for materials research), a position as

Research Associate / Group Leader

(Subject to personal qualification, employees are remunerated according to salary group E 14 TV-L)

starting at the next possible date and limited until 31.12.2022. The period of employment is governed by the Fixed Term Research Contracts Act (Wissenschaftszeitvertragsgesetz - WissZeitVG).

The Lab "Dresden Centre for Intelligent Materials" (GCL DCIM) is focused on novel materials which, as a central component of intelligent systems, feel, think and act autonomously through integrated sensory and actuator functionalities. It initially consists of the two research groups "Hierarchical Topologies: structures with material-inherent control functions" and "Materials Informatics: data-driven approaches for materials research" and is intended to establish complementary competencies and to develop promising future research areas.

The research group "Materials Informatics" is concerned with data-driven approaches for the description and integration of novel tailor-made materials. Modern materials research requires an integrative and multidisciplinary approach, which increasingly relies on methods from mathematics and computer science in addition to traditional approaches from chemistry, physics and engineering. In particular, machine learning and the evaluation of "big data" are essential for tomorrow's materials research and related engineering sciences. The development of strategies for materials discovery and development is therefore the focus of Materials Informatics.

You will be part of a team of enthusiastic scientists who will creatively pursue their individual research agenda, inspired by the innovative approach and support of the Centre. Your research environment will include access to state-of-the-art research infrastructure, the promotion of gender equality and a family-friendly working environment.

Tasks: You will be integrated into the activities of the Materials Informatics research group and will interact with scientists in the field of computational materials research involved in the Dresden Centre for Computational Materials Science (DCMS). In the field of development and application of data-driven approaches for the description and integration of novel tailor-made materials, the tasks include, besides your own research activities: monitoring the scientific development of the group, an active role in the collaboration with internal and external partners in research and industry, and the acquisition of external funding. The position offers excellent prospects for personal development within the framework of a university career from postdoc to group leader. The willingness to prepare an application for an individual junior research group (e.g. Emmy Noether Group, ERC Starting Grant) is expected and supported.

Requirements: a university degree and a doctorate are required, preferably in physics, chemistry, mechanical engineering, materials science or computer science. Personal initiative, the ability to work independently as well as team-oriented research, and excellent language skills (German, English) are expected. Experience in the field of materials simulations, with a special focus on data-based and data-intensive approaches and materials genomics, is considered advantageous. We are looking for a top-notch, proactive young scientist who wants to make a name for herself or himself in science.

Applications from women are particularly welcome. The same applies to people with disabilities.

Please submit your comprehensive application, including a letter of motivation, a two-page research statement about your possible contribution to the scientific activities of the group (considering the research environment at TU Dresden and the scientific environment in Dresden), CV, complete list of publications and at least two letters of reference, as one single PDF file by 22.10.2020 (stamped arrival date of the university central mail service applies), preferably via the TU Dresden SecureMail Portal https://securemail.tu-dresden.de by sending it to dcim@tu-dresden.de with the subject line "Application DCIM Materials Informatics, your_surname", or by mail to TU Dresden, Fakultät Maschinenwesen, Institut für Werkstoffwissenschaft, Professur für Materialwissenschaft und Nanotechnik, Herrn Prof. Dr. Gianaurelio Cuniberti, Helmholtzstr. 10, 01069 Dresden. Please submit copies only, as your application will not be returned to you. Expenses incurred in attending interviews cannot be reimbursed.

Reference to data protection: Your data protection rights, the purpose for which your data will be processed, and further information about data protection are available to you on the website: https://tu-dresden.de/karriere/datenschutzhinweis

More:
Research Associate / Group Leader in the field of Materials Informatics data-driven approaches for job with TECHNISCHE UNIVERSITAT DRESDEN (TU...

Written by admin |

September 21st, 2020 at 11:51 pm

Jeffrey Aronson: When I Use a Word . . . Assessment – The BMJ – The BMJ

Posted: at 11:51 pm


With severe osteoarthritis in my left hip, I am due to have a hip replacement. My having been a wicket-keeper for over 30 years can't have helped, but I also have a long-standing deformity in that hip, and the right hip is fine. So today I went for a preoperative assessment.

To assess the origins of assessment we must start with the Indo-European root SED, to put something down or sit.

In Latin, SED gives the verb sedēre, to sit. It's not obvious, but in Greek it gives the verb ἕζεσθαι, to sit. The S in the Indo-European root becomes an aspirate in Greek (see Box 1).

Add to ἕζεσθαι the prefix κατά, down, and you get καθέζεσθαι, to sit down. Akathisia is an inability to sit still, associated with an inner feeling of restlessness. It is sometimes caused by dopamine receptor antagonists, such as chlorpromazine and haloperidol. The focal form of akathisia, the restless legs syndrome, also called Ekbom's syndrome or Willis-Ekbom disease, if severe, is sometimes treated with dopamine receptor agonists.

Related to ἕζεσθαι is the Greek verb ἵζειν, which also means to sit. Add a prefix, ἐπί, on or over, and we get the verb πιέζειν, to sit heavily on, to press down. This gives us the English prefix piezo-, referring to physical phenomena that are elicited by the application of pressure, such as the piezoelectric effect, piezomagnetism, and piezoresistance.

The Greek nouns from ἕζεσθαι are ἕδος and ἕδρα, each meaning some form of seat. This gave -εδρον (-hedron), metaphorically referring to each base (or seat) of a many-sided three-dimensional figure, such as an octahedron or dodecahedron. The Greeks regarded each of the five regular polyhedra (tetrahedron, cube, octahedron, dodecahedron, and icosahedron), also called the Platonic solids, as being intrinsic to the construction of the universe. In his Mysterium Cosmographicum (1596), the German astronomer Johannes Kepler proposed a model of the solar system in which the five Platonic solids sat inside one another, interlaid with a series of inscribed and circumscribed spheres (Figure 1).

Add κατά to ἕδρα and you get καθέδρα, a chair or throne, especially for a teacher, a professor, or a bishop. Anyone who speaks ex cathedra does so with authority, or more often with apparent authority. And a cathedral is where the bishop sits. In French a cathedra becomes a chaise, from which we have the now largely obsolete chaise-longue. In an essay titled "The First Mrs. Tanqueray" in a collection called While Rome Burns (1934), the American critic Alexander Woollcott quoted Mrs Patrick Campbell as having referred to "The deep, deep peace of the double-bed after the hurly-burly of the chaise-longue."

Now add ἐπί to ἕδρα and we get Ephedra, a genus of plants that trail along the ground, from which comes the adrenoceptor agonist ephedrine.

In Latin SED gave sedēre, to sit, giving us sit and seat and many other English words (Figure 2). Add the prefix ad, beside or towards, and you get ad sedēre, contracted to assidēre, literally to sit beside. An assessor was originally one who sat beside a judge or other official, as an assistant, giving advice. When I was Oxford University Assessor some years ago, an elderly visiting Royal asked me whom I assessed. I told her that I didn't assess anyone, but assisted the Proctors, Senior and Junior, the university officials appointed annually from among the rank and file of the university, primarily to take care of student discipline and to oversee examinations.

An assessor was also one who assisted tax collectors, and so assessment came to mean fixing the rate of tax or determining the amount owed by an individual. From there it came to mean determining other matters, evaluating a person or thing or estimating the quality, value, or extent of anything.

Assessment of people normally involves others, but these days we are all being encouraged to assess ourselves, as part of our personal development, and to plan our future work on the basis of the outcomes. Is this helpful, or are we fooling ourselves? I shall look at the evidence next week.

Jeffrey Aronson is a clinical pharmacologist, working in the Centre for Evidence Based Medicine in Oxford's Nuffield Department of Primary Care Health Sciences. He is also president emeritus of the British Pharmacological Society.

Competing interests: None declared.

Read the original here:
Jeffrey Aronson: When I Use a Word . . . Assessment - The BMJ - The BMJ

Written by admin |

September 21st, 2020 at 11:51 pm

What the world of work will look like in 2035 – Gadget

Posted: at 11:51 pm


A year-long examination of global work patterns and plans has found that, by 2035, employees will become more engaged and productive and fuel innovation and growth. This is a key finding of Work 2035, a study undertaken by Citrix Systems to understand how work will change and the role that technology will play in enabling people to perform at their best.

The key question was: What will the workforce, work models and the work environment look like in 2035? And how will technology shape them?

Citrix teamed up with futurist consultancy Oxford Analytica and business research specialist Coleman Parkes to survey over 500 C-Suite leaders and 1,000 employees within large corporations and mid-market businesses across the United States, United Kingdom, Germany, France and the Netherlands on current and future workforce strategies and work models.

The main conclusions were:

Robots will not replace humans. But they will make us smarter and more efficient. More than three-quarters of those polled (77 percent) believe that in fifteen years, artificial intelligence (AI) will significantly speed up the decision-making process and make workers more productive.

New jobs will be created. New roles will emerge to support a technology-driven workplace and the changing relationship between humans and machines. Here are the positions respondents believe will be created:

Work will be more flexible. Technology that allows for seamless access to the tools and information people need to collaborate and get work done wherever they happen to be will fuel the flexible models that the future of work will demand.

Leadership will have a new look. More than half of those surveyed (57 percent) believe AI will make most business decisions and potentially eliminate the need for senior management teams.

Productivity will get a major boost. Technology, closely integrated with humans, will drive step changes in productivity as workers are supported by solutions that enable them to perform at their best. "AI-ngels", digital assistants driven by AI, will draw on personal and workplace data to help employees prioritize their tasks and time and ensure mental and physical wellness. These worker-augmenting assistants will, for example, schedule meetings to take place at the most effective time based on factors ranging from the blood sugar levels of participants to their sentiments at different times of day. And while the meetings are taking place, they will monitor concentration levels and attitudes and adjust as necessary to drive optimal outcomes.

More than half of professionals surveyed (51 percent) believe technology will make workers at least twice as productive by 2035. Among the solutions they believe will be commonplace:

Employee engagement will improve. As technology and AI take over time-consuming, mundane tasks, work will become more strategic and employees more engaged.

Innovation and growth will soar. Organizations will invest more in technology and AI than human capital. This will open the door to unprecedented levels of innovation and new revenue streams and fuel sustainable growth, particularly among small businesses.

"The COVID-19 pandemic has forced companies to reimagine the way things get done, and over the next 15 years, they will face more challenges and disruptions than ever," says Tim Minahan, executive vice president of business strategy at Citrix. "But as Work 2035 makes clear, within this chaos lies opportunity. Savvy companies are using this crisis to begin planning for the next normal. Not just to return to where they were, but to embrace new workforce and work models to power their business forward."

* Click here to download a complimentary copy of Work 2035.

Follow this link:
What the world of work will look like in 2035 - Gadget

Written by admin |

September 21st, 2020 at 11:51 pm

Uncertain Times for the Financial Sector Is Open Source the Solution? – Global Banking And Finance Review

Posted: at 11:51 pm


By Piers Wilson, Head of Product Management at Huntsman Security

The Financial Reporting Council (FRC), which is responsible for corporate governance, reporting and auditing in the UK, has been consulting on the role of technology in audit processes. This highlights growing recognition of the fact that technology can assist audits, providing the ability to automate data gathering or assessment to increase quality, remove subjectivity and make the process more trustworthy and consistent. Both the Brydon review and the latest AQR thematic suggest a link between enhanced audit quality and the increasing use of technology. This goes beyond efficiency gains from process automation and relates, in part, to the larger volume of data and evidence which can be extracted from an audited entity and the sophistication of the tools available to interrogate it.

As one example, the PCAOB in the US has for a while advocated for the provision of audit evidence and reports to be timely (which implies computerisation and automation) to assure that risks are being managed, and for the extent of human interaction with evidence or source data to be reflected to ensure influence is minimised (the more that can be achieved programmatically and objectively, the better).

However, technology may obscure the nature of analysis and decision making and create a barrier to fully transparent audits compared to more manual (yet labour-intensive) processes. There is also a competition aspect between larger firms and smaller ones as regards access to technology:

Brydon raised concerns about the ability of challenger firms to keep pace with the Big Four firms in the deployment of innovative new technology.

The FRC consultation paper covers issues, and asks questions, in a number of areas. Examples include:

Clearly these are real issues for a process that aims to provide trustworthy, objective, transparent and repeatable outputs; any use of technology to speed up or improve the process must maintain these standards.

Audit technology solutions in cyber security

The cyber security realm has quickly grown to become a major area of risk and hence a focus for boards, technologists and auditors alike. The highly technical nature of threats and the adversarial nature of cyber attackers (who will actively try to find and exploit control failures) mean that technology solutions that identify weaknesses and report on specific or overall vulnerabilities are becoming more entrenched in the assurance process within this discipline.

While the audit consultations and reports mentioned above cover the wider audit spectrum, similar challenges relate to cyber security as an inherently technology-focussed area of operation.

Benefits of speed

The gains from using technology to conduct data gathering, analysis and reporting are obvious: removing the need for human questionnaires, interviews, inspections and manual number crunching. Increasing the speed of the process has a number of benefits:

Benefits of flexibility

The ability to conduct audits across different sites or scopes, to specify different thresholds of risk for different domains, and the ease of conducting audits at remote locations or on suppliers' networks (especially during periods of restricted travel) are all factors that can make technology a useful tool for the auditor.

Benefits of transparency

One part of the FRC's perceived problem space is that of transparency: you can ask a human how they derived a result, and they can probably tell you, or at least show you the audit trail of correspondence, meeting notes or spreadsheet calculations. But can you do this with software or technology?

Certainly, the use of AI and machine learning makes this hard: the learning nature and often black-box calculations are not easy to understand, to recalculate in a repeatable way or to document. The system learns, so it is always changing, and hence the rationale for a decision might not always be the same.

In technologies that are geared towards delivering audit outcomes, this is easier. First, if you collect and retain data and provide an easy interface to go from results to the underlying cases in the source data, it is possible to take a score/rating/risk and reveal the specifics of what led to it. Secondly, it is vital that the calculations are transparent, i.e. that the methods of calculating risks or the way results are scored are decipherable.

Benefits of consistency

This is one obvious gain from technology: the logic is pre-programmed in. If you take two auditors and give them the same data sets or evidence case files, they might draw different conclusions (possibly for valid reasons, or due to them having different skill areas or experience), but the same algorithm operating on the same data will produce the same result every time.

Manual evidence gathering suffers a number of drawbacks: it relies on written notes, records of verbal conversations, email trails, spreadsheets, or questionnaire responses in different formats. Retaining all this in a coherent way is difficult, and going back through it even harder.

Using a consistent toolset and consistent data format means that if you need to go back to a data source from a particular network domain three months ago, you will have information that is readily available and readable. And as stated above, if the source data and evidence is re-examined using a consistent solution, you will get the same calculations, decisions and results.

Benefits of systematically generated KPIs, cyber maturity measuresand issues

The outputs of any audit process need to provide details of the issues found so that the specific or general cases of the failures can be investigated and resolved. But for managers, operational teams and businesses, having a view of the KPIs for the security operations process is extremely useful.

Of course, following the lines of defence model, an internal or external formal audit might simply want the results and a level of trust in how they were calculated; however for operational management and ongoing continuous visibility, the need to derive performance statistics comes into its own.

It is worth noting that there are two dimensions to KPIs: the assessment of the strength or configuration of a control or policy (how good the control is) and the extent or level of coverage (how widely it is enforced).

To give a view of the technical maturity of a defence you really need to combine these two factors. A weak control that is widely implemented or a strong control that provides only partial coverage are both causes for concern.
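As a concrete reading of this, one simple way to combine the two dimensions is to multiply them, so that either a weak control or poor coverage drags the score down. A minimal sketch, assuming hypothetical 0-1 normalised inputs (the article prescribes no formula):

```python
def control_maturity(strength: float, coverage: float) -> float:
    """Combine control strength and coverage into one maturity score.

    Both inputs are assumed normalised to the 0-1 range (hypothetical
    scales). Multiplying them means a strong-but-partial control and a
    weak-but-widespread control both score low, mirroring the point
    above that either case is a cause for concern.
    """
    if not (0.0 <= strength <= 1.0 and 0.0 <= coverage <= 1.0):
        raise ValueError("strength and coverage must be within [0, 1]")
    return strength * coverage


# A strong control (0.9) enforced on only half the estate (0.5) scores
# 0.45 -- well below either input taken alone.
print(control_maturity(0.9, 0.5))
```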

Benefits of separation of process stages

The final area where technology can help is in allowing the separation and distribution of the data gathering, analysis and reporting processes. It is hard to take the data, evidence and meeting notes from someone else and analyse them. For one thing, are they trustworthy and reliable (in the case of third-party assurance questionnaires, perhaps)? Then it is also hard to draw high-level conclusions about the analysis.

If technology allows the data gathering to be performed in a distributed way, say by local site administrators, third-party IT staff or non-expert users, but in a trustworthy way, then the overhead of the audit process is much reduced. Instead of a team having to conduct multiple visits, interviews or data collection activities, the toolset can be provided to the people nearest to the point of collection.

This allows the data analysis and interpretation to be performed centrally by the experts in a particular field or control area. So giving a non-expert user a way to collect and provide relevant and trustworthy audit evidence takes a large bite out of the resource overhead of conducting the audit, for both auditor and auditee.

It also means that a target organisation doesn't have to manage the issue of allowing auditors to have access to networks, sites, data, accounts and systems to gather the audit evidence, as this can be undertaken by existing administrators in the environment.

Making the right choice

Technology solutions in the audit process can clearly deliver benefits; however, if they are too simplistic or aim to be too clever, they can simply move the problem of providing high levels of audit quality. A rapidly generated AI-based risk score is useful, but if it's not possible to understand the calculation, it is hard to either correct the control issues or troubleshoot the underlying process.

Where technology can assist the audit process, speed up data gathering and analysis, and streamline the generation of high- and low-level outputs, it can be a boon.

Technology allows organisations to put trustworthy assurance into the hands of operations teams and managers, consultants and auditors alike to provide flexible, rapid and frequent views of control data and understanding of risk posture. If this can be done in a way that is cognisant of the risks and challenges, as we have shown, then auditors and regulators such as the FRC can be satisfied.

See the rest here:
Uncertain Times for the Financial Sector Is Open Source the Solution? - Global Banking And Finance Review

Written by admin |

September 21st, 2020 at 11:51 pm

The end of the cookie and the new era of digital marketing – Global Banking And Finance Review

Posted: at 11:51 pm



Originally posted here:
The end of the cookie and the new era of digital marketing - Global Banking And Finance Review

Written by admin |

September 21st, 2020 at 11:51 pm

Proximity matters: Using machine learning and geospatial analytics to reduce COVID-19 exposure risk – Healthcare IT News

Posted: September 20, 2020 at 10:56 pm


Since the earliest days of the COVID-19 pandemic, one of the biggest challenges for health systems has been to gain an understanding of the community spread of this virus and to determine how likely it is that a person walking through the doors of a facility is at a higher risk of being COVID-19 positive.

Without adequate access to testing data, health systems early on were often forced to rely on individuals to answer questions such as whether they had traveled to certain high-risk regions. Even that unreliable method of assessing risk started becoming meaningless as local community spread took hold.

Parkland Health & Hospital System, the safety net health system for Dallas County, Texas, and PCCI, a Dallas-based non-profit with expertise in the practical applications of advanced data science and social determinants of health, had a better idea.

Community spread of an infectious disease is made possible through physical proximity and density of active carriers and non-infected individuals. Thus, to understand the risk of an individual contracting the disease (exposure risk), it was necessary to assess their proximity to confirmed COVID-19 cases, based on their address, and the population density of those locations.

If an "exposure risk" index could be created, then Parkland could use it to minimize exposure for their patients and health workers and provide targeted educational outreach in highly vulnerable zip codes.

PCCI's data science and clinical team worked diligently in collaboration with the Parkland Informatics team to develop an innovative machine-learning-driven predictive model called Proximity Index. Proximity Index predicts an individual's COVID-19 exposure risk, based on their proximity to test-positive cases and the population density.
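PCCI has not published the model internals, but the two inputs the article names (proximity to test-positive cases and population density) are enough to sketch the general idea. The following is an illustration only, with hypothetical function names, radius and data, not the actual Proximity Index:

```python
import math


def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    earth_radius_km = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * earth_radius_km * math.asin(math.sqrt(a))


def proximity_index(home, cases, population_density, radius_km=5.0):
    """Toy exposure score: inverse-distance-weighted count of confirmed
    cases within radius_km of an address, scaled by population density."""
    score = 0.0
    for case_lat, case_lon in cases:
        d = haversine_km(home[0], home[1], case_lat, case_lon)
        if d <= radius_km:
            score += 1.0 / max(d, 0.1)  # floor so a next-door case can't dominate
    return score * population_density


# Hypothetical address and case coordinates (roughly Dallas County).
cases = [(32.78, -96.80), (32.79, -96.81), (32.70, -96.90)]
print(proximity_index((32.78, -96.79), cases, population_density=2.5))
```

A real implementation would also need geocoding of patient addresses and a regularly refreshed case feed, which is presumably what the Isthmus platform and EHR integration described below provide.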

This model was put into action at Parkland through PCCI's cloud-based advanced analytics and machine learning platform called Isthmus. PCCI's machine learning engineering team generated geospatial analysis for the model and, with support from the Parkland IT team, integrated it with their electronic health record system.

Since April 22, Parkland's population health team has utilized the Proximity Index for four key system-wide initiatives, to triage more than 100,000 patient encounters and to assess needs proactively:

In the future, PCCI is planning on offering Proximity Index to other organizations in the community (schools, employers, etc.), as well as to individuals, to provide them with a data-driven tool to help in decision making around reopening the economy and society in a safe, thoughtful manner.

Many teams across the Parkland family collaborated on this project, including the IT team led by Brett Moran, MD, Senior Vice President, Associate Chief Medical Officer and Chief Medical Information Officer at Parkland Health and Hospital System.

See more here:

Proximity matters: Using machine learning and geospatial analytics to reduce COVID-19 exposure risk - Healthcare IT News

Written by admin |

September 20th, 2020 at 10:56 pm

Posted in Machine Learning

PREDICTING THE OPTIMUM PATH – Port Strategy

Posted: at 10:56 pm


A joint venture has seen the implementation of machine learning at HHLA's Container Terminal Burchardkai to optimise import container yard positioning and reduce re-handling moves.

The elimination of costly re-handling moves of import containers has recently been the focus of a joint project between container terminal operator HHLA, its affiliate Hamburg Port Consulting (HPC) and INFORM, the Artificial Intelligence (AI) systems supplier. Machine learning sits at the heart of the system.

Dwell time is the unit of time used to measure the period in which a container remains in a container terminal, typically running from its arrival off a vessel until leaving the terminal via truck, rail or another vessel.

For import containers there is often no specific information available on the pick-up time when selecting a storage slot in the container stack. This can lead to an inefficient container storage location in the yard, generating, in turn, the requirement for additional shuffle moves that consume extra resources, including maintenance and energy.

To mitigate this operational inefficiency, the project partners (HHLA, HPC and INFORM) have recently run a pilot project at HHLA's Container Terminal Burchardkai (CTB) focused on machine learning technology, applied in order to predict individual import container dwell times and thereby reduce costly re-handling/shuffle moves.

As a specialist in IT software integration and terminal operations, HPC employed a deep learning approach to identify hidden patterns in historical data of container moves at HHLA CTB. This was undertaken over a period of two years, with the acquired information processed into high-quality data sets. Assessed by the Syncrotess Machine Learning Module from INFORM and validated by the HPC simulation tool, the results show a significant reduction of shuffle moves, resulting in a reduced truck turn time.

PRODUCTIVE IMPLEMENTATION

Dr. Alexis Pangalos, Partner at HPC, discussing the project highlights, notes: "It was a productive implementation of INFORM's Artificial Intelligence (AI) solution for the choice of container storage positions at CTB. The Machine Learning (ML) Module was trained with data from CTB's container handling operations, and the outcome from this is a system tailor-made for HHLA's operations."

HPC, together with INFORM, has integrated the Syncrotess ML Module into the slot allocation algorithms already running within CTB's terminal control system, ITS.

PREDICTING DWELL TIME

INFORM's AI solution predicts the dwell time (i.e., the time period the container is expected to be stored in the yard) and the outbound mode of transport (e.g., rail, truck, vessel), both of which are crucial criteria for selecting an optimised container storage location within the yard: a location that avoids unnecessary re-handling.
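The article does not disclose how the Syncrotess ML Module is built, but the task it describes (two predictions per container, learned from two years of historical moves) maps onto standard supervised classification. A minimal sketch under that assumption, using scikit-learn's gradient boosting on synthetic stand-in data; all feature names and label buckets here are invented:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for historical container records; real features
# might include weight, reefer flag, arrival hour, shipping line, etc.
X = rng.random((500, 4))
y_dwell = rng.integers(0, 4, 500)   # dwell-time bucket: 0 = <1 day ... 3 = >7 days
y_mode = rng.integers(0, 3, 500)    # outbound mode: 0 = rail, 1 = truck, 2 = vessel

# One classifier per target, since the system predicts both outputs.
dwell_model = GradientBoostingClassifier().fit(X, y_dwell)
mode_model = GradientBoostingClassifier().fit(X, y_mode)

# At slot-allocation time the predictions feed the yard logic: a container
# expected to leave soon by truck is stacked where no shuffle move is needed.
new_container = rng.random((1, 4))
print("dwell bucket:", dwell_model.predict(new_container)[0])
print("outbound mode:", mode_model.predict(new_container)[0])
```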

"Utilising machine learning and AI and integrating these technologies into existing IT infrastructure are the success factors for reaching the next level of optimisations," says Jens Hansen, Executive Board Member responsible for IT at HHLA. "A detailed analysis, and smooth interconnectivity between all the different systems, enable the value of improved safety while reducing costs and greenhouse gas emissions," he underlines.

DETAILED DOMAIN KNOWLEDGE

"Data availability and data processing are key elements when it comes to utilising AI technology," says Pangalos. "It requires detailed domain knowledge of terminal operations to unlock greater productivity of the terminal equipment and connected processes."

The implementation is based on a machine learning assessment INFORM undertook in 2018, whereby it set out to determine whether it could improve optimisation and operational outcomes using INFORM's broader ML algorithms developed for use in other industries such as finance and aviation.

As of 2019, system results indicated a prediction accuracy of 26% for dwell time predictions and 33% for outbound mode of transport predictions.

Dr. Eva Savelsberg, Senior Vice President of INFORM's Logistics Division, notes: "AI and machine learning allow us to leverage data from our past performance to inform us about how best to approach our future operations. Our ML Module gives our Operations Research based algorithms the best footing for making complex decisions about what to do in the future.

"INFORM's Machine Learning Module allows CTB to leverage insights generated from algorithms that continuously learn from historical data."

Further Information: Matthew Wittemeier m.wittemeier@inform-software.com

Continued here:

PREDICTING THE OPTIMUM PATH - Port Strategy

Written by admin |

September 20th, 2020 at 10:56 pm

Posted in Machine Learning

What is ‘custom machine learning’ and why is it important for programmatic optimisation? – The Drum

Posted: at 10:56 pm


Wayne Blodwell, founder and chief exec of The Programmatic Advisory & The Programmatic University, battles through the buzzwords to explain why custom machine learning can help you unlock differentiation and regain a competitive edge.

Back in the day, simply having programmatic on plan was enough to give you a competitive advantage, and no one asked any questions. But as programmatic has grown and matured (84.5% of US digital display spend is due to be bought programmatically in 2020; the UK is on track for 92.5%), what's next to gain advantage in an increasingly competitive landscape?

Machine Learning

[noun]

The use and development of computer systems that are able to learn and adapt without following explicit instructions, by using algorithms and statistical models to analyse and draw inferences from patterns in data.

(Oxford Dictionary, 2020)

You've probably heard of machine learning as it exists in many Demand Side Platforms in the form of automated bidding. Automated bidding functionality does not require a manual CPM bid input nor any further bid adjustments; instead, bids are automated and adjusted based on machine learning. Automated bids work from goal inputs, e.g. achieve a CPA of x, or simply maximise conversions, and these inputs steer the machine learning to prioritise certain needs within the campaign. This tool is immensely helpful in taking the guesswork out of bids and removing the need for continual bid intervention.

These are what would be considered off-the-shelf algorithms, as all buyers within the DSP have access to the same tool. There is a heavy reliance on this automation for buying, with many even forgoing traditional optimisations for fear of disrupting the learnings and holding the algorithm back. But how do we know this approach is truly maximising our results?

Well, we don't. What we do know is that this machine learning will be reasonably generic, to suit the broad range of buyers that are activating in the platforms. And more often than not, the functionality is limited to a single success metric, provided with little context, which can isolate campaign KPIs away from their true overarching business objectives.

Custom machine learning

Instead of using out-of-the-box solutions, possibly the same as your direct competitors, custom machine learning is the next logical step to unlock differentiation and regain an edge. Custom machine learning is simply machine learning that is tailored towards specific needs and events.

Off-the-shelf algorithms are owned by the DSPs; however, custom machine learning is owned by the buyer. The opportunity for application is growing, with leading DSPs opening their APIs and consoles to allow for custom logic to be built on top of existing infrastructure. Third-party machine learning partners are also available, such as Scibids, MIQ & 59A, which will develop custom logic and add a layer onto the DSPs to act as a virtual trader, building out granular strategies and approaches.

With this ownership and customisation, buyers can factor in custom metrics such as viewability measurement and feed in their first-party data to align their buying and success metrics with specific business goals.
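As an illustration of what such custom logic might look like (all names, signals and weightings here are hypothetical, and real DSP custom-bidding APIs vary), here is a sketch of a bid function that folds a viewability prediction and first-party user value into the CPM decision:

```python
def custom_bid(base_cpm: float, viewability: float, user_value: float,
               target_roi: float = 2.0) -> float:
    """Return an adjusted CPM bid from buyer-owned signals.

    viewability: predicted probability (0-1) that the impression is viewable.
    user_value: expected value of this user from the buyer's first-party
    data, in the same currency as the CPM. Every name and weighting here
    is hypothetical.
    """
    expected_value = viewability * user_value
    multiplier = expected_value / (base_cpm * target_roi)
    multiplier = min(max(multiplier, 0.1), 3.0)  # floor and cap the adjustment
    return round(base_cpm * multiplier, 2)


# A high-value user on a highly viewable placement bids up; a low-value
# user on a poorly viewable one falls toward the floor.
print(custom_bid(2.00, viewability=0.85, user_value=12.0))  # 5.10
print(custom_bid(2.00, viewability=0.30, user_value=1.0))   # 0.20
```

The point of the sketch is the ownership: the buyer, not the DSP, decides which signals enter the bid and how they are weighted against the business goal.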

This level of automation not only provides a competitive edge in terms of correctly valuing inventory and prioritisation, but the transparency of the process allows trust to rightfully be placed in the automation.

Custom considerations

For custom machine learning to be effective, there are a handful of fundamental requirements which will help determine whether this approach is relevant for your campaigns. It's important to have conversations with providers surrounding minimum event thresholds and campaign size, to understand how much value you stand to gain from this path.

Furthermore, a custom approach will not fix a poor campaign. Custom machine learning is intended to take a well-structured and well-managed campaign and maximise its potential. Data needs to be in line for it to be adequately ingested and for real insight and benefit to be gained. Custom machine learning cannot simply be left to fend for itself; it may lighten the regular day-to-day load of a trader, but it needs to be maintained and closely monitored for maximum impact.

While custom machine learning brings numerous benefits to the table (transparency, flexibility, goal alignment), it's not without upkeep and workflow disruption. Levels of operational commitment may differ depending on the vendors selected to facilitate this customisation and their functionality, but generally buyers must be willing to adapt to maximise the potential that custom machine learning holds.

Find out more on machine learning in a session The Programmatic University are hosting alongside Scibids on "The Future Of Campaign Optimisation" on 17 September. Sign up here.

Read more:

What is 'custom machine learning' and why is it important for programmatic optimisation? - The Drum

Written by admin |

September 20th, 2020 at 10:56 pm

Posted in Machine Learning

