4 research outputs found
A Study of Hierarchical Concatenation Networks in the Area of Pattern Recognition
Hierarchical Concatenation Networks (HCNs) are inspired by the way humans recognize patterns, i.e. by concatenating small features. In HCNs, patterns are split into small parts, which are then concatenated and activated in the network's layers. The research in this thesis investigated and explored feature extraction methods, similarity measures, and classification using HCNs. Results indicate that HCNs can be used in automatic pattern recognition systems, with better performance at the lower layers than at the top layer.
Seminar Nasional Inovasi Teknologi dan Ilmu Komputer
Seminar Nasional Inovasi Teknologi dan Ilmu Komputer (SNITIK) is an annual event held by the Fakultas Teknologi dan Ilmu Komputer (Faculty of Technology and Computer Science). The event is part of the implementation of the Faculty's vision. This year, SNITIK carries the theme TECHNOPRENEUR: Digital Start-up Business, with the aim of introducing students to technology and to current developments in the business world. One example of technopreneurship that is growing rapidly today is technopreneurship in the field of information technology. Almost without notice, information technology has changed the daily habits of many people, for example in booking flight tickets, health check-ups, digital wallets, food ordering, package delivery, and so on. Human needs are not covered by information technology alone; they also require other technological developments, such as food, industrial, and chemical technology. Given the pace of progress and ever-increasing human needs, SNITIK 2019 is expected to broaden participants' horizons and encourage them to take part and become technopreneurs.
Strategic approaches to learning: an examination of children's problem-solving in early childhood classes
This thesis shows how children's learning is influenced and modified by the teaching environment. The metacognitive, self-regulatory learning behaviours of sixteen kindergarten students were examined in order to determine how students perceive learning, either by adopting deep approaches, where the focus is on understanding and meaning, or surface approaches, where the meeting of institutional demands frequently subjugates the former goals. The data have been analysed within a qualitative paradigm from a phenomenographic perspective. The study addresses three issues: the nature and frequency of the strategic learning behaviours displayed by the students; the contribution strategic behaviours make to the adoption of deep or surface learning approaches; and how metacognitive teaching environments influence higher-order thinking. Findings reveal that where teachers had metacognitive training, the frequency of strategy use increased irrespective of student performance. High-achieving students used more strategic behaviours, used them with greater efficiency, and tended to display more of the characteristics of deep-approach learners. This study suggests that many of the differential outcomes evident amongst students may be substantially reduced through early and consistent training within a teaching environment conducive to the development of metacognitive, self-regulatory behaviours and deep learning approaches.
A Fuzzy Deep Learning Classifier
This paper considers the solution of a classification problem based on an analysis of the presented review. It is shown that neural networks have important advantages over other methods, such as nearest-neighbour classification, support vector classification, classification using decision trees, etc. Many artificial neural networks exist with a very simple structure, but the precision of the solution can be increased with a deep learning approach, which presupposes the use of an additional neural network to solve pretraining tasks (deep belief networks). A new topology is proposed, consisting of a Takagi-Sugeno-Kang fuzzy classifier and a Restricted Boltzmann Machine neural network. Although this topology was proposed earlier, this article carries out sufficient research to specify the learning algorithm. An example implementation of the proposed algorithm is presented.
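The pretraining component named in this abstract, a Restricted Boltzmann Machine, can be illustrated with a minimal sketch trained by one step of contrastive divergence (CD-1). This is a generic textbook formulation, not the paper's actual hybrid with the Takagi-Sugeno-Kang classifier; all class names, hyperparameters, and the toy data are illustrative assumptions.

```python
# Minimal RBM with CD-1 pretraining (illustrative sketch, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden):
        # Small random weights and zero biases, a common initialization.
        self.W = rng.normal(0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible bias
        self.b_h = np.zeros(n_hidden)    # hidden bias

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0, lr=0.1):
        """One contrastive-divergence (CD-1) update on a batch of binary rows."""
        h0 = self.hidden_probs(v0)
        h0_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h0_sample)      # one-step reconstruction
        h1 = self.hidden_probs(v1)
        n = v0.shape[0]
        self.W += lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b_v += lr * (v0 - v1).mean(axis=0)
        self.b_h += lr * (h0 - h1).mean(axis=0)
        return float(np.mean((v0 - v1) ** 2))   # reconstruction error

# Toy pretraining data: two correlated binary prototypes.
proto = np.array([[1, 1, 1, 0, 0, 0],
                  [0, 0, 0, 1, 1, 1]], dtype=float)
data = proto[rng.integers(0, 2, size=64)]

rbm = RBM(n_visible=6, n_hidden=4)
errors = [rbm.cd1_step(data) for _ in range(200)]
```

After pretraining, the hidden-unit activations `rbm.hidden_probs(v)` would serve as learned features for a downstream classifier, which is the role the deep belief network plays in the topology described above.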