Difference between revisions of "Team:SEU/Contribution"

 
(5 intermediate revisions by 2 users not shown)
Line 1: Line 1:
 
{{SEU/Header}}
<html>
<head><script src="https://2019.igem.org/common/MathJax-2.5-latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML"></script></head>
 
 
<style>
.buttonContri {
−     background-color: #4CAF50 !important;
+     background-color: #088A29 !important;
      border: none !important;
      color: white !important;
+     border-radius: 6px;
      padding: 16px 32px !important;
      text-align: center !important;
Line 27: Line 27:
 
                                   <div class="about-contentbox">
 
                                   <div class="about-contentbox">
 
                                       <div>
 
                                       <div>
−                                           <h2>Model</h2>
+                                           <h2>Contribution</h2>
−                                           <h3>Computation method</h3>
+                                           <p style="font-size:36px">By combining knowledge from the life sciences and information science, we obtained the results of this project. Throughout the process we encountered many difficulties and challenges, but after careful thought and practice we overcame them. We have also summarized some information that may be helpful to other teams, hoping to contribute to the iGEM community.</p>
−                                           <p>Addition: \(A_1 \xrightarrow{k_1} O,\quad A_2 \xrightarrow{k_2} O\)</p>
−                                           <p style="font-size:24px">Proof: \(\dfrac{d [A_i](t)}{d t}=-k_i[A_i](t)\) \(\Rightarrow [A_i](t)=[A_i](0)e^{-k_it},\) \(\dfrac{d [O](t)}{d t}=\sum_{i=1}^2 k_i[A_i](t)\) \(\Rightarrow [O](\infty)=\int_0^\infty \sum_{i=1}^2 k_i[A_i](t)\,dt = [A_1](0)+[A_2](0).\) Hence addition is implemented.</p>
+                                           <p style="font-size:36px">1. We propose molecular computation models for arithmetic operations in artificial neural networks, together with the relevant reaction kinetics analysis.</p>
−                                           <p style="font-size:24px">Subtraction: \(A+B \xrightarrow{k_1} \phi\)</p>
+                                           <center><a href="https://2019.igem.org/Team:SEU/Model" class="buttonContri">Model</a></center>
−                                           <p style="font-size:24px">Proof: The scheme is the same as in [1]. Since \(A\) and \(B\) are consumed at the same rate, \([A](t)=[B](t)+\Delta\) with \(\Delta=[A](0)-[B](0)\) (taking \(k_1=1\) for simplicity). <br>If \(\Delta \neq 0\), \(\dfrac{d [A](t)}{d t}=-[A](t)([A](t)-\Delta)\) \(\Rightarrow [A](t)=\dfrac{[A](0)\Delta e^{\Delta t}}{[A](0)e^{\Delta t}-[A](0)+\Delta}.\) If \(\Delta > 0\), \([A](\infty)=\Delta\); otherwise \([A](\infty)=0\). <br>If \(\Delta =0\), \([A](t)=\dfrac{[A](0)}{1+[A](0)t}\) and \([A](\infty)=0\). Hence subtraction is implemented.</p>
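The same kind of numerical check for the subtraction scheme (again a minimal sketch with arbitrary example values; the leftover A approximates max([A](0) - [B](0), 0)):

# Subtraction CRN: A + B --k1--> phi
import numpy as np
from scipy.integrate import odeint

k1 = 1.0                     # example rate constant
a_0, b_0 = 5.0, 2.0          # initial concentrations

def rhs(y, t):
    a, b = y
    flux = k1 * a * b
    return [-flux, -flux]

t = np.linspace(0, 200, 5000)
a, b = odeint(rhs, [a_0, b_0], t).T
print(a[-1])                 # approx 3.0 = max(a_0 - b_0, 0)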
+                                           <p style="font-size:36px">2. Based on these models, we implement artificial neurons, the basic elements of neural networks, with DNA reactions. We also realize the backpropagation training process for DNA-based neural networks.</p>
−                                           <p style="font-size:24px">Multiplication: \(\alpha \xrightarrow{k_1} \phi,\quad A+B+\alpha \xrightarrow{k_2} A+B+\alpha+C\)</p>
+                                           <center><a href="https://2019.igem.org/Team:SEU/Demonstrate" class="buttonContri">Demonstrate</a></center>
−                                           <p style="font-size:24px">Proof: \(\dfrac{d [\alpha](t)}{d t}=-k_1[\alpha](t)\) \(\Rightarrow [\alpha](t)=[\alpha](0)e^{-k_1t},\) \(\dfrac{d [A](t)}{d t}=\dfrac{d [B](t)}{d t}=0,\ \dfrac{d [C](t)}{d t}=k_2[A](t)[B](t)[\alpha](t)\) \(\Rightarrow [C](\infty)=\int_0^\infty k_2[A](0)[B](0)[\alpha](t)\,dt = (k_2/k_1)[\alpha](0)[A](0)[B](0)\). Hence multiplication is implemented.</p>
−                                           <h3>References</h3>
+                                           <p style="font-size:36px">3. To assist with experiments, we develop a software tool that generates the DNA reactions and corresponding DNA sequences according to the specified input size of the neural network. Researchers can use this tool directly to obtain their expected DNA-based neural networks and conduct experiments.</p>
−                                           <p>[1] C. Fang, Z. Shen, Z. Zhang, X. You and C. Zhang, "Synthesizing a Neuron Using Chemical Reactions," 2018 IEEE International Workshop on Signal Processing Systems (SiPS), Cape Town, 2018, pp. 187-192.</p>
+                                           <center><a href="https://2019.igem.org/Team:SEU/Software" class="buttonContri">Software</a></center>
−                                           <p>[2] M. Vasic, D. Soloveichik and S. Khurshid, "CRN++: Molecular Programming Language," arXiv preprint arXiv:1809.07430, 2018.</p>
+                                           <p style="font-size:36px">4. We also conduct DNA experiments to validate our theory. qRT-PCR and PAGE results are obtained, and an analysis of the experimental data is provided.</p>
+                                           <center><a href="https://2019.igem.org/Team:SEU/Experiments" class="buttonContri">Experiments</a></center>
                                       </div>
                                   </div>

Latest revision as of 13:24, 19 October 2019





Contribution

By combining knowledge from the life sciences and information science, we obtained the results of this project. Throughout the process we encountered many difficulties and challenges, but after careful thought and practice we overcame them. We have also summarized some information that may be helpful to other teams, hoping to contribute to the iGEM community.

1. We propose molecular computation models for arithmetic operations in artificial neural networks, together with the relevant reaction kinetics analysis.

Model

2. Based on these models, we implement artificial neurons, the basic elements of neural networks, with DNA reactions. We also realize the backpropagation training process for DNA-based neural networks.

Demonstrate
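For readers from a software background, the sketch below (not the team's actual construction, which is documented on the Demonstrate page) shows how a single neuron can be expressed purely in terms of the three arithmetic schemes from the revision above: scaled multiplication, addition, and the subtraction scheme, whose leftover reactant naturally behaves like a max(x, 0)-style nonlinearity. Since concentrations cannot be negative, the weights are assumed here to be split into positive and negative parts; the team's actual encoding may differ.

# Illustrative only: one neuron built from the CRN-implementable primitives.
def crn_mul(a, b, scale=1.0):    # multiplication scheme, constants folded into scale
    return scale * a * b

def crn_add(a, b):               # addition scheme
    return a + b

def crn_sub(a, b):               # subtraction scheme: leftover A = max(a - b, 0)
    return max(a - b, 0.0)

def neuron(x, w_pos, w_neg):
    pos, neg = 0.0, 0.0
    for xi, wp, wn in zip(x, w_pos, w_neg):
        pos = crn_add(pos, crn_mul(xi, wp))   # positive-weight contributions
        neg = crn_add(neg, crn_mul(xi, wn))   # negative-weight contributions
    return crn_sub(pos, neg)                  # weighted sum plus rectification

print(neuron([1.0, 2.0], w_pos=[0.5, 0.0], w_neg=[0.0, 0.3]))   # 0.5 - 0.6 -> 0.0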

3. To assist with experiments, we develop a software tool that generates the DNA reactions and corresponding DNA sequences according to the specified input size of the neural network. Researchers can use this tool directly to obtain their expected DNA-based neural networks and conduct experiments.

Software
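As an illustration of what such a generator has to do (a hypothetical sketch, not the team's actual tool or its output format; the real tool is on the Software page), the snippet below enumerates abstract reactions for one fully connected layer, pairing the multiplication scheme for each weight with an accumulation step for each output neuron. All species names here are made up.

# Hypothetical sketch of reaction enumeration for an n_in x n_out dense layer.
def layer_reactions(n_in, n_out):
    reactions = []
    for j in range(n_out):
        for i in range(n_in):
            # multiplication scheme: X_i * W_ij -> contribution P_ij
            reactions.append(f"a{i}_{j} -> phi")
            reactions.append(f"X{i} + W{i}_{j} + a{i}_{j} -> X{i} + W{i}_{j} + a{i}_{j} + P{i}_{j}")
            # addition scheme: accumulate contributions into the output species S_j
            reactions.append(f"P{i}_{j} -> S{j}")
    return reactions

for r in layer_reactions(2, 1):
    print(r)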

4. We also conduct DNA experiments to validate our theory. qRT-PCR and PAGE results are obtained, and an analysis of the experimental data is provided.

Experiments