
DL & CD

The document contains the Continuous Evaluation Test (CET)-2 papers for the B. Tech CSE/IT programme at ITM (SLS) Baroda University, covering Deep Learning and Compiler Design. Each paper includes test instructions, multiple-choice questions, and short-answer prompts on topics in deep learning and compiler theory. The Deep Learning paper is scheduled for September 25, 2023, and the Compiler Design paper for September 29, 2023; each carries 30 marks.

STUDENT'S ENROLMENT NUMBER: ______________

ITM (SLS) BARODA UNIVERSITY


SCHOOL OF COMPUTER SCIENCE, ENGINEERING AND TECHNOLOGY (SOCSET)
B. TECH CSE/IT/AI/DS ODD SEMESTER 2023-24
CONTINUOUS EVALUATION TEST (CET)-2
SEMESTER: 7    COURSE-CODE: C2720C2    COURSE-NAME: DEEP LEARNING
DATE: 25/9/2023    MARKS: 30    TIME: 10:30 AM to 12:00 PM

Instructions:
• All questions are mandatory. There are no external options.
• Make suitable assumptions, wherever necessary, and state them clearly.
• Use of a non-programmable calculator is allowed / not allowed.
• Figures to the right indicate maximum marks.

Q1. [6]
1. In CNNs, which operation allows for parameter sharing?
a) Fully connected layers
b) Pooling layers
c) Convolutional layers
d) Activation layers

2. Regularization methods like Ll and L2 are used to:


a) Increase model complexity
b) Reduce model complexity
c) Enhance model accuracy
d) Decrease model interpretability

3. In a Bidirectional RNN (Bi-RNN), what is the primary advantage of processing
input sequences in both forward and backward directions?
a) It reduces the computational complexity of the network.
b) It allows the network to process sequences in parallel.
c) It captures context from both preceding and following elements.
d) It prevents overfitting in the model.

4. What is a limitation of undercomplete autoencoders?


a) They are computationally expensive to train
b) They may struggle to capture complex data patterns
c) They always result in overfitting
d) They require a large amount of labelled data

5. Which of the following is a common regularization technique applied to
prevent overfitting in autoencoder models?
a) Data augmentation
b) Weight initialization
c) Dropout
d) Batch normalization

6. How can the challenges in training Boltzmann Machines (BMs) be mitigated?


a) By using a larger number of hidden units
b) By using a smaller number of hidden units
c) By increasing the learning rate
d) By using dropout regularization

Q2. Answer any Two (out of Four) [6]


1. Write a short note on DenseNet.
2. In Deep Learning, where can we use the Transfer Learning approach?
3. Draw the architecture of a Bi-directional RNN and explain it in detail.
4. What are the fundamental principles of Boltzmann Machines (BM)?

Q3. Answer any Two (out of Four) [6]


1. Write a short note on PixelNet.
2. What are the Transfer Learning techniques?
3. How is DenseNet better compared to a CNN?
4. Explain the concept of regularized autoencoders.

Q4. Answer any Two (out of Four) [6]


1. What is called Sequence Modelling in Deep Learning?
2. Provide an overview of Long Short-Term Memory (LSTM) networks.
3. Write a short note on the Undercomplete Autoencoder.
4. Differentiate between Recurrent Neural Networks and Recursive Neural Networks.

Q5. Answer any Two (out of Four) [6]


1. Describe the architecture of a Bi-directional RNN. How does it work?
2. Explain the concept of Backpropagation Through Time (BPTT) in the
context of training recurrent neural networks.
3. Write a short note on Stochastic Encoders and Decoders.
4. Explain the concept of Parameter Sharing in Deep Learning.


Enrolment No:
Roll No:

ITM (SLS) Baroda University, Vadodara
(School of Computer Science, Engineering and Technology)
B.Tech - CSE, CSE-IT, AI: Semester VII
CET-2
Subject Name: Compiler Design
Subject Code: C2710C2
Time: 10:30-12:00
Date: 29/9/2023    Maximum Marks: 30
Q.1. MCQ [6]
(a) Which of the following concepts of FSA is used in the compiler?
a) Code optimization
b) Code generation
c) Lexical analysis
d) Parser

(b) What is CFG?
a) Regular Expression
b) Compiler
c) Language expression
d) All of the mentioned
(c) Which of the following errors can the Compiler diagnose?
a) Logical errors only
b) Grammatical and logical errors
c) Grammatical errors only
d) All of the mentioned
(d) Characters are grouped into tokens in which of the following phases of the compiler design?
a) Code generator
b) Lexical analyzer
c) Parser
d) Code optimization

(e) Which of the following can detect an error if a programmer by mistake writes multiplication
instead of division?
a) Interpreter
b) Compiler or interpreter test
c) Compiler
d) None of the mentioned

(f) The output of the lexical and syntax analyzer can be stated as:
a) parse stream, parse tree
b) token tree, parse tree
c) token stream, parse tree
d) all of the mentioned
Q.2. ATTEMPT ANY TWO OUT OF FOUR [06]
(a) Draw the DFA for the regular expression (a|b)*abb using the set construction method only.
(b) How to convert a regular expression to automata? Explain with an example.
(c) Write down pseudo code for simulating a DFA, e.g. (a|b)*abb (an illustrative sketch follows this question block).
(d) How can we represent an NFA in a transition table?
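For question (c), the following is a minimal table-driven sketch in C of how a DFA for (a|b)*abb can be simulated; the particular state numbering (state 3 accepting) and the sample input string are assumptions chosen for illustration.

    /* Table-driven simulation of a DFA accepting (a|b)*abb (illustrative sketch). */
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        /* move[state][symbol]: symbol 0 = 'a', symbol 1 = 'b' */
        int move[4][2] = {
            {1, 0},   /* state 0: start                           */
            {1, 2},   /* state 1: input read so far ends in "a"   */
            {1, 3},   /* state 2: input read so far ends in "ab"  */
            {1, 0}    /* state 3: ends in "abb" (accepting state) */
        };
        const char *input = "abaabb";   /* sample input string (assumption) */
        int state = 0;                  /* start state */

        for (size_t i = 0; i < strlen(input); i++) {
            int sym = (input[i] == 'a') ? 0 : 1;   /* map character to table column */
            state = move[state][sym];              /* one transition per character  */
        }
        printf("%s\n", state == 3 ? "accepted" : "rejected");
        return 0;
    }

The same loop simulates any DFA; only the transition table, the start state, and the set of accepting states change.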
Q.3. ATTEMPT ANY TWO OUT OF FOUR [06]
(a) Draw the NFA for the regular expression (a|b)*ab* using Thompson's construction and
convert it into a DFA.

(b) Explain the symbol table. For what purpose does the compiler use the symbol table? Explain with C program instructions.
(c) What are conflicts in an LR Parser? What are their types? Explain with an example.
(d) Differentiate between Context-Free Grammars and Regular Expressions.

Q.4. ATTEMPT ANY ONE OUT OF TWO [06]
(a) Write down the algorithm for eliminating left recursion (a worked example follows this question block).
(b) How can we use the reduction technique for "bottom-up parsing"?

Q.5. ATTEMPT ANY TWO OUT OF FOUR [06]
(a) Differentiate between a top-down parser and a bottom-up parser.
(b) Draw the "Annotated parse tree" for the expression: 3 * 5 + 4 n.
(c) Draw the "Annotated parse tree and three address code for c + a[i][j]".
(d) Explain the techniques of the "Parser Generator Yacc".
*****
