Search results for: distilling

Number of results: 796

Journal: Journal of Information Security and Applications, 2021

When handling a security incident, there is a lot of information that needs to be stored, processed, and analyzed. Given the volume of data and the need to investigate incidents promptly, different forensic tools have been developed to provide cyber threat intelligence and response management platform solutions. These enable responders to collaborate effectively in identifying and investigating incidents, ma...

Journal: Journal of the Science of Food and Agriculture, 2010
Andrew M Watson Martin C Hare Peter S Kettlewell James M Brosnan Reginald C Agu

BACKGROUND Since demand for distilling wheat is expected to increase rapidly as a result of the development of the bioethanol industry, efficient production will become of increasing importance. Achieving this will require an understanding of the agronomic factors that influence both grain yield and alcohol yield. Therefore, five field experiments using the winter distilling wheat variety Glasgo...

Journal: IEEE Transactions on Information Theory, 2004

Journal: Lecture Notes in Computer Science, 2021

Generative Adversarial Networks (GANs) have become a powerful approach for generative image modeling. However, GANs are notorious for their training instability, especially on large-scale, complex datasets. While the recent work of BigGAN has significantly improved the quality of generation on ImageNet, it requires a huge model, making it hard to deploy on resource-constrained devices. To reduce the model size, we propo...

Journal: Journal of Autism and Developmental Disorders, 2016
Clara M Lajonchere Barbara Y Wheeler Thomas W Valente Cary Kreutzer Aron Munson Shrikanth Narayanan Abe Kazemzadeh Roxana Cruz Irene Martinez Sheree M Schrager Lisa Schweitzer Tara Chklovski Darryl Hwang

Low-income Hispanic families experience multiple barriers to accessing evidence-based information on Autism Spectrum Disorders (ASD). This study utilized a mixed-strategy intervention to create access to information in published biomedical research articles on ASD by distilling the content into parent-friendly English- and Spanish-language ASD Science Briefs and presenting them to participants...

Journal: Lecture Notes in Computer Science, 2023

Although deep neural networks have enjoyed remarkable success across a wide variety of tasks, their ever-increasing size also imposes significant overhead on deployment. To compress these models, knowledge distillation was proposed to transfer knowledge from a cumbersome (teacher) network into a lightweight (student) network. However, the guidance of the teacher does not always improve the generalization of students, espe...
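The abstract above refers to the standard knowledge-distillation setup. As a point of reference, here is a minimal sketch of the classic distillation objective (a weighted sum of a hard-label cross-entropy term and a temperature-softened KL term, in the style of Hinton et al., 2015); it is an illustration of the general technique, not the specific method proposed in the paper above, and the function names and default hyperparameters are our own choices.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.5):
    # Hard term: cross-entropy between student prediction and ground truth.
    student_probs = softmax(student_logits)
    hard_loss = -math.log(student_probs[true_label])
    # Soft term: KL divergence from the teacher's softened distribution
    # to the student's, scaled by T^2 to keep gradient magnitudes comparable.
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    soft_loss = sum(pt * math.log(pt / ps)
                    for pt, ps in zip(p_teacher, p_student))
    return alpha * hard_loss + (1 - alpha) * (temperature ** 2) * soft_loss
```

When the student's logits match the teacher's, the KL term vanishes and only the hard cross-entropy term remains, which is one way to sanity-check an implementation.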
