Yazar "Sen, Baha" seçeneğine göre listele
Listeleniyor 1 - 20 / 30
Item: Analysing the success of level determination exam according to the school type and lesson type that is represented by questions in the exam (Elsevier Science BV, 2013)
Authors: Cavusoglu, Abdullah; Sen, Baha; Ucar, Emine; Ucar, Murat
The purpose of this study is to determine whether there is any relation between the level determination exam (LDE) score, the type of school, and the scores students obtained in the lessons represented by questions in the exam. Randomly selected data belonging to 8th-grade primary school students from across Turkey were used as the sample. To investigate the significance of the relationship between each pair of variables, the correlation values were first examined and then statistical tests were applied. As a result, a strong, significant, and positive relation was observed between students' achievement in the lessons represented by questions in the exam and their LDE scores. Furthermore, the type of school that students attend also affects success: students attending private schools are more successful than students in public schools.

Item: Analysis of Demographic Characteristics Creating Coronary Artery Disease Susceptibility using Random Forests Classifier (Elsevier Science BV, 2015)
Authors: Akyol, Kemal; Calik, Elif; Bayir, Safak; Sen, Baha; Cavusoglu, Abdullah
Cardiovascular system diseases are an important health problem. These diseases are very common and responsible for many deaths. This study aims to analyze the factors that cause coronary artery disease (CAD) using the random forests classifier. According to the analysis, we observed, for each factor, the correct classification ratio and a performance measure of its contribution to CAD susceptibility. The performance measure results clearly show the impact of demographic characteristics on CAD. Additionally, this study shows that the random forests algorithm can be used for the processing and classification of medical data such as CAD data.
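To make the random-forests step above concrete, here is a minimal sketch in Python with scikit-learn. The demographic factor names and the synthetic data are illustrative assumptions, not the study's actual dataset or results.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
# Hypothetical demographic factors (assumed for illustration only).
X = np.column_stack([
    rng.integers(30, 80, n),   # age
    rng.integers(0, 2, n),     # sex (0/1)
    rng.integers(0, 2, n),     # smoking (0/1)
    rng.normal(25, 4, n),      # body mass index
])
# Synthetic CAD label, loosely driven by age and smoking.
y = ((X[:, 0] > 55) & (X[:, 2] == 1)).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
clf.fit(X, y)
# Per-factor importances, analogous to the per-factor measures in the abstract.
for name, imp in zip(["age", "sex", "smoking", "bmi"], clf.feature_importances_):
    print(f"{name}: {imp:.3f}")
```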
Item: Assessing the importance of features for detection of hard exudates in retinal images (Tubitak Scientific & Technological Research Council Turkey, 2017)
Authors: Akyol, Kemal; Sen, Baha; Bayir, Safak; Cakmak, Hasan Basri
Diabetes disrupts the operation of the eye and leads to vision loss, affecting in particular the nerve layer and the capillary vessels in this layer through changes in the blood vessels of the retina. Sudden vision loss and blurred vision occur depending on the phase of the disease, which is called diabetic retinopathy. Hard exudates are one of the primary signs of diabetic retinopathy, and their automatic recognition in retinal images can contribute to detection of the disease. We present an automatic screening system for the detection of hard exudates. This system consists of two main steps. First, features were extracted from patch images of hard exudate and normal regions using the DAISY algorithm, which is based on the histogram of oriented gradients. Next, we applied the recursive feature elimination (RFE) method with logistic regression (LR) and support vector classifier (SVC) estimators on the raw dataset, obtaining two datasets containing the most important features. The number of important features in the datasets created with LR and SVC was 126 and 259, respectively. Afterward, we compared different classification algorithms using 5-fold cross-validation on these datasets of important features, and the random forest (RF) classifier proved to be the best. Second, we extracted the important features from the feature vector corresponding to the region of interest, in accordance with the keypoint information in a new retinal fundus image, and performed detection of hard exudate regions on the retinal fundus image using the RF classifier.

Item: Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques (Hindawi Ltd, 2016)
Authors: Akyol, Kemal; Sen, Baha; Bayir, Safak
With the advances in the computer field, methods and techniques of automatic image processing and analysis provide the opportunity to detect change and degeneration in retinal images automatically. Localization of the optic disc is extremely important in computer-aided eye disease diagnosis systems for determining hard exudate lesions or neovascularization, the later phase of diabetic retinopathy. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region in retinal images affected by diabetic retinopathy may be difficult, since optic disc and hard exudate regions can look the same to a machine learning model. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary construction, and classification. We tested our proposed technique on three public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved on the DIARETDB1, DRIVE, and ROC public datasets, respectively.
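The "visual dictionary" step named in the optic-disc abstract above is commonly realized as a bag-of-visual-words encoding. Below is a minimal sketch with random stand-in descriptors and an assumed codebook size; it is not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
train_desc = rng.normal(size=(1000, 64))   # stand-in 64-D local descriptors
k = 32                                     # assumed dictionary (codebook) size
codebook = KMeans(n_clusters=k, n_init=10, random_state=1).fit(train_desc)

def bovw_histogram(desc):
    """Encode one image's descriptors as a normalized visual-word histogram."""
    words = codebook.predict(desc)
    hist = np.bincount(words, minlength=k).astype(float)
    return hist / max(hist.sum(), 1.0)

# The histogram then serves as the feature vector fed to a classifier.
print(bovw_histogram(rng.normal(size=(120, 64))).round(3))
```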
Item: A Comparative Study on Classification of Sleep Stage Based on EEG Signals Using Feature Selection and Classification Algorithms (Springer, 2014)
Authors: Sen, Baha; Peker, Musa; Cavusoglu, Abdullah; Celebi, Fatih V.
Sleep scoring is one of the most important diagnostic methods in psychiatry and neurology, but sleep staging is a time-consuming and difficult task undertaken by sleep experts. This study aims to identify a method that would classify sleep stages automatically and with a high degree of accuracy, and in this manner assist sleep experts. The study consists of three stages: feature extraction from EEG signals, feature selection, and classification of these signals. In the feature extraction stage, 20 feature algorithms in four categories are used, and 41 feature parameters are obtained from them. Feature selection is important for eliminating irrelevant and redundant features; in this manner prediction accuracy is improved and the computational overhead of classification is reduced. Effective feature selection algorithms such as minimum redundancy maximum relevance (mRMR), fast correlation-based feature selection (FCBF), ReliefF, the t-test, and the Fisher score are used at the feature selection stage to select the set of features that best represents the EEG signals. The selected features are used as input parameters for the classification algorithms. At the classification stage, five different classification algorithms (random forest (RF), feed-forward neural network (FFNN), decision tree (DT), support vector machine (SVM), and radial basis function neural network (RBF)) classify the problem. The results obtained from the different classification algorithms are provided so that computation times and accuracy rates can be compared. Finally, 97.03% classification accuracy is obtained with the proposed method. The results show that the proposed method can serve as the basis for a new intelligent sleep scoring assistance system.

Item: A cooperative GPU-based Parallel Multistart Simulated Annealing algorithm for Quadratic Assignment Problem (Elsevier - Division Reed Elsevier India Pvt Ltd, 2018)
Authors: Sonuc, Emrullah; Sen, Baha; Bayir, Safak
GPU hardware and the CUDA architecture provide a powerful platform for developing parallel algorithms, yet implementations of heuristic and metaheuristic algorithms on GPUs remain limited in the literature. Developing parallel algorithms on the GPU has therefore become very important. In this paper, the NP-hard Quadratic Assignment Problem (QAP), one of the classic combinatorial optimization problems, is discussed. A Parallel Multistart Simulated Annealing (PMSA) method is developed with the CUDA architecture to solve the QAP. An efficient method is obtained by combining the multistart technique with cooperation between threads, both within the same block and across different blocks. This paper focuses on both acceleration and solution quality. Computational experiments were conducted on many Quadratic Assignment Problem Library (QAPLIB) instances. The experimental results show that PMSA runs up to 29x faster than a single-core CPU and finds the best known solution in a short time on many benchmark datasets.
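As a rough illustration of the multistart simulated annealing idea above, here is a serial Python sketch for the QAP on a random instance. The paper's CUDA implementation and its inter-thread cooperation are not reproduced; the cooling schedule and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 12
F = rng.integers(1, 10, (n, n))   # flow matrix of a random demo instance
D = rng.integers(1, 10, (n, n))   # distance matrix

def cost(p):
    # QAP objective: sum_ij F[i, j] * D[p[i], p[j]]
    return int((F * D[np.ix_(p, p)]).sum())

def anneal(p, temp=100.0, alpha=0.995, iters=2000):
    best, best_c = p.copy(), cost(p)
    c = best_c
    for _ in range(iters):
        i, j = rng.choice(n, size=2, replace=False)
        p[i], p[j] = p[j], p[i]              # try swapping two facilities
        new_c = cost(p)
        if new_c <= c or rng.random() < np.exp((c - new_c) / temp):
            c = new_c                        # accept (possibly uphill) move
            if c < best_c:
                best, best_c = p.copy(), c
        else:
            p[i], p[j] = p[j], p[i]          # undo the rejected swap
        temp *= alpha
    return best, best_c

# Multistart: independent restarts from random permutations; keep the best.
runs = [anneal(rng.permutation(n)) for _ in range(8)]
print(min(runs, key=lambda r: r[1]))
```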
Item: A Decision Support System for Early-Stage Diabetic Retinopathy Lesions (Science & Information Sai Organization Ltd, 2017)
Authors: Akyol, Kemal; Bayir, Safak; Sen, Baha
The retina is a network layer containing light-sensitive cells; diseases that occur in this layer, which performs eyesight, threaten our vision directly. Diabetic retinopathy is one of the main complications of diabetes mellitus and is the most significant factor contributing to blindness in the later stages of the disease, so early diagnosis is of great importance to prevent its progress. For this purpose, this study developed an application based on image processing techniques and machine learning that provides decision support to specialists for the detection of the hard exudate, cotton wool spot, hemorrhage, and microaneurysm lesions that appear in the early stages of the disease. Meaningful information was extracted from a set of samples obtained from the DIARETDB1 dataset during the system modeling process. In this process, Gabor and discrete Fourier transform attributes were utilized, and dimension reduction was performed using the spectral regression discriminant analysis algorithm. Then, the performances of the random forest and logistic regression classifiers were evaluated on each attribute dataset. Experimental results were obtained using retinal fundus images from both the DIARETDB1 dataset and the Department of Ophthalmology, Ataturk Training and Research Hospital in Ankara.

Item: Early-exit Optimization Using Mixed Norm Despeckling for SAR Images (IEEE, 2015)
Authors: Ozcan, Caner; Sen, Baha; Nar, Fatih
Speckle noise, which is inherent to Synthetic Aperture Radar (SAR) imaging, obstructs various image exploitation tasks such as edge detection, segmentation, change detection, and target recognition. Speckle reduction is therefore generally used as a first step, and it has to smooth out homogeneous regions while preserving edges and point scatterers. In remote sensing applications, the computational load and memory consumption of despeckling must be improved for SAR images. In this paper, an early-exit total variation approach is proposed that combines the l1-norm and the l2-norm in order to improve despeckling quality while keeping the algorithm's execution time reasonably short. Speckle reduction performance, execution time, and memory consumption are shown using spot mode SAR images.

Item: An efficient Pseudo microprocessor for engineering education (Elsevier Science BV, 2012)
Authors: Gorgunoglu, Salih; Peker, Musa; Sen, Baha; Cavusoglu, Abdullah
Computer architecture, computer organization, and digital circuits are among the basic topics taught in computer science and engineering. Because of their degree of abstraction, students often find these subjects difficult to comprehend. Computer architecture and digital circuits are the basis of the microprocessors and microcontrollers widely used for controlling systems in industry, so it is necessary to understand the structure and programming of microprocessors in order to use them effectively in applications. In this study, an educational tool for simulating the execution of a simple microprocessor's commands is presented. This simple microprocessor is in fact an imaginary one that does not exist commercially; it is custom made to fit our educational purposes. The simulation platform was realized with a visual programming platform. With this study, we have achieved an improvement in students' understanding of these topics as taught in our engineering classes.
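In the spirit of the imaginary teaching microprocessor above, a fetch-decode-execute loop can be sketched in a few lines. The four-instruction set here (LOAD/ADD/STORE/HALT) is invented for illustration and is not the paper's actual design.

```python
def run(program, memory):
    acc, pc = 0, 0                  # accumulator and program counter
    while True:
        op, arg = program[pc]       # fetch
        pc += 1
        if op == "LOAD":            # decode + execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory
        else:
            raise ValueError(f"unknown opcode {op}")

# Example program: mem[2] = mem[0] + mem[1]
mem = {0: 7, 1: 5, 2: 0}
prog = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)]
print(run(prog, mem))   # {0: 7, 1: 5, 2: 12}
```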
Item: An efficient solving of the traveling salesman problem: the ant colony system having parameters optimized by the Taguchi method (Tubitak Scientific & Technological Research Council Turkey, 2013)
Authors: Peker, Musa; Sen, Baha; Kumru, Pinar Yildiz
Owing to its complexity, the traveling salesman problem (TSP) is one of the most intensively studied problems in computational mathematics. The TSP is defined as minimizing the total distance, cost, and duration of a tour that visits each of n points exactly once and returns to the starting point. Various heuristic algorithms used in many fields have been developed to solve this problem. In this study, a solution is proposed for the TSP using the ant colony system, with parameter optimization performed by the Taguchi method. The implementation was tested on various data sets from the Traveling Salesman Problem Library and a performance analysis was undertaken. In addition, an analysis of variance was undertaken to identify the effect of each parameter on the system. The implementation software was developed in MATLAB, with a useful interface and simulation support.

Item: Evaluating the achievements of computer engineering department of distance education students with data mining methods (Elsevier Science BV, 2012)
Authors: Sen, Baha; Ucar, Emine
Recently, internet technology has become an indispensable part of life and has made possible very useful applications that were not feasible earlier. One of these is distance learning technology. Because traditional learning-teaching methods are limited to classroom activities, specially prepared education units are delivered to learners through a wide range of media when direct communication and interaction between instructor and learners are not possible. Accordingly, distance education can be described as a training system in which the instructor and the student, although far away from each other, communicate through a tool at the same time (synchronously) or at different times (asynchronously). The aim of this study is to compare, using data mining techniques, the achievements of Computer Engineering Department students at Karabuk University according to criteria such as age, gender, type of high school graduated from, and whether the students study in distance education or regular education. The differences between the techniques are also discussed according to the results, and suggestions are made as to which technique would be more effective.

Item: Expert system-based educational tool for practice lessons in a vocational high school (Sila Science, 2012)
Authors: Peker, Musa; Sen, Baha
Expert systems are used as supplementary tools in various decision-making activities. Expert system software that can bring solutions to hardware and software problems was designed as part of this study. The study aimed to determine whether there is a statistically significant difference between teaching 11th-grade System Maintenance and Repair classes with the traditional method and with an expert-system-supported teaching method, in terms of their effect on students' implementation skills. Two analyses were undertaken with regard to performance assessment. When the results were examined, it was seen that the expert-system-supported teaching method was more effective than traditional methods and led to a significant increase in student success. It is contended that these systems may be beneficial for increasing student achievement and decreasing the time needed for training in applied classes in vocational schools.
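As a hedged illustration of the kind of diagnostic expert system described above, here is a minimal forward-chaining sketch. The rules and symptom names are invented placeholders, not the study's actual knowledge base.

```python
# Each rule: if all listed symptoms hold, the advice fires.
RULES = [
    ({"no_power", "fan_silent"}, "check the power supply unit"),
    ({"beeps_on_boot"}, "reseat or replace the RAM"),
    ({"boots", "no_display"}, "check the video cable / graphics card"),
    ({"slow", "disk_noise"}, "back up data and test the hard disk"),
]

def diagnose(symptoms):
    """Return every advice whose rule conditions are all satisfied."""
    facts = set(symptoms)
    return [advice for conditions, advice in RULES if conditions <= facts]

print(diagnose({"no_power", "fan_silent"}))   # ['check the power supply unit']
```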
Item: Fast Feature Preserving Despeckling (IEEE, 2014)
Authors: Ozcan, Caner; Sen, Baha; Nar, Fatih
Synthetic Aperture Radar (SAR) images contain a high amount of speckle noise, which makes edge detection, shape analysis, classification, segmentation, change detection, and target recognition tasks more difficult. To overcome such difficulties, smoothing homogeneous regions while preserving point scatterers and edges during speckle reduction is quite important. Besides, due to the huge size of SAR images in remote sensing applications, computational load and memory consumption must be further improved. In this paper, a parallel computational approach is proposed for the Feature Preserving Despeckling (FPD) method, which was chosen due to its success in speckle reduction. The speckle reduction performance, execution time, and memory consumption of the proposed Fast FPD (FFPD) method are shown using spot mode SAR images.

Item: GPU efficient SAR image despeckling using mixed norms (Spie-Int Soc Optical Engineering, 2014)
Authors: Ozcan, Caner; Sen, Baha; Nar, Fatih
Speckle noise, which is inherent to Synthetic Aperture Radar (SAR) imaging, obstructs various image exploitation tasks such as edge detection, segmentation, change detection, and target recognition. Therefore, speckle reduction is generally used as a first step, and it has to smooth out homogeneous regions while preserving edges and point scatterers. Traditional speckle reduction methods are fast and their memory consumption is insignificant, but they are either good at smoothing homogeneous regions or at preserving edges and point scatterers. State-of-the-art despeckling methods overcome this trade-off but introduce another one between denoising quality and resource consumption, whereby higher denoising quality requires higher computational load and/or memory consumption. In this paper, a local pixel-based total variation (TV) approach is proposed that combines the l2-norm and the l1-norm in order to improve despeckling quality while keeping execution times reasonably short. The pixel-based approach allows an efficient computation model with relatively low memory consumption, and its parallel implementation is more efficient than global TV approaches, which generally require the numerical solution of sparse linear systems. However, pixel-based approaches are frequently trapped in local minima, so their despeckling quality is worse than that of global TV approaches. The proposed method, namely mixed norm despeckling (MND), combines the l2-norm and the l1-norm in order to improve despeckling performance by alleviating the local minima problem. All steps of the MND are parallelized using OpenMP on the CPU and CUDA on the GPU. Speckle reduction performance, execution time, and memory consumption of the proposed method are shown using synthetic images and TerraSAR-X spot mode SAR images.
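To give a feel for the mixed l2/l1 idea above, here is a toy pixel-wise gradient-descent sketch of TV smoothing with a mixed data-fidelity term. It is a serial NumPy illustration under assumed parameters; the MND energy and its OpenMP/CUDA implementation are not reproduced.

```python
import numpy as np

def mixed_norm_smooth(f, l2w=1.0, l1w=0.2, tvw=0.3, step=0.1, iters=100, eps=1e-6):
    u = f.copy()
    for _ in range(iters):
        # Forward differences and a smoothed-TV gradient via divergence.
        ux = np.diff(u, axis=1, append=u[:, -1:])
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(ux**2 + uy**2 + eps)
        px, py = ux / mag, uy / mag
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        # Mixed data fidelity: l2 pulls toward f smoothly, l1 adds robustness.
        grad = 2.0 * l2w * (u - f) + l1w * np.sign(u - f) - tvw * div
        u -= step * grad
    return u

rng = np.random.default_rng(3)
noisy = 1.0 + 0.3 * rng.standard_normal((64, 64))
smoothed = mixed_norm_smooth(noisy)
print(np.abs(noisy - 1.0).mean(), np.abs(smoothed - 1.0).mean())  # error shrinks
```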
Item: A hybrid CNN-LSTM model for pre-miRNA classification (Nature Portfolio, 2021)
Authors: Tasdelen, Abdulkadir; Sen, Baha
miRNAs (or microRNAs) are small, endogenous, noncoding RNAs of about 22 nucleotides. Cumulative evidence from biological experiments shows that miRNAs play a fundamental and important role in various biological processes, so the classification of miRNA is a critical problem in computational biology. Due to the short length of mature miRNAs, many researchers work on precursor miRNAs (pre-miRNAs), which have longer sequences and more structural features. Pre-miRNAs can be divided into two groups, mirtrons and canonical miRNAs, in terms of biogenesis differences; compared to mirtrons, canonical miRNAs are more conserved and easier to identify. Many existing pre-miRNA classification methods rely on manual feature extraction, and they focus on either the sequential structure or the spatial structure of pre-miRNAs. To overcome the limitations of previous models, we propose a nucleotide-level hybrid deep learning method based on a CNN and an LSTM network together. The prediction resulted in 0.943 (95% CI ±0.014) accuracy, 0.935 (95% CI ±0.016) sensitivity, 0.948 (95% CI ±0.029) specificity, 0.925 (95% CI ±0.016) F1 score, and 0.880 (95% CI ±0.028) Matthews correlation coefficient. Our proposed method achieved the best results for accuracy, F1 score, and MCC, which were 2.51%, 1.00%, and 2.43% higher than the closest previous results, respectively; mean sensitivity ranked first, tied with linear discriminant analysis. The results indicate that hybrid CNN and LSTM networks can be employed to achieve better performance for pre-miRNA classification. In future work, we will investigate new classification models that deliver better performance in terms of all the evaluation criteria.

Item: Improvement of Radial basis Function Interpolation Performance on Cranial Implant Design (Science & Information Sai Organization Ltd, 2017)
Authors: Atasoy, Ferhat; Sen, Baha; Nar, Fatih; Bozkurt, Ismail
Cranioplasty is a neurosurgical operation for repairing cranial defects that have occurred in a previous operation or trauma. Various methods have been presented for cranioplasty from past to present. In methods based on computer-aided design, the quality of an implant depends on the operator's talent. In methods based on mathematical models, such as curve fitting and various interpolations, the healthy parts of a skull are used to generate the implant model, and researchers have worked to improve the performance of these models, which are independent of the operator's talent. In this study, an improvement of radial basis function (RBF) interpolation performance using symmetrical data is presented. Since we focused on improving RBF interpolation performance for cranial implant design, results were compared with previous studies involving the same technique. In comparison with previously presented results, the difference between the computed implant model and the original skull was reduced from 7 mm to 2 mm using the newly proposed approach.
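The symmetry idea above can be sketched as follows: sample points from the healthy side of a synthetic skull-like surface are mirrored across the mid-plane before fitting the RBF. The geometry, the kernel choice, and the use of SciPy's RBFInterpolator are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(4)
# Synthetic "healthy side" samples: (x, y) positions with surface height z.
xy = rng.uniform([0.1, -1.0], [1.0, 1.0], size=(200, 2))   # x > 0 side only
z = np.sqrt(np.clip(1.0 - xy[:, 0]**2 - xy[:, 1]**2, 0.0, None))  # hemisphere

# Mirror across the x = 0 plane to exploit left-right symmetry.
xy_sym = np.vstack([xy, xy * [-1.0, 1.0]])
z_sym = np.concatenate([z, z])

rbf = RBFInterpolator(xy_sym, z_sym, kernel="thin_plate_spline")
# Evaluate over the defective (x < 0) region where the implant is needed.
query = np.column_stack([np.full(5, -0.5), np.linspace(-0.5, 0.5, 5)])
print(rbf(query))
```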
Item: Investigation of the performance of LU decomposition method using CUDA (Elsevier Science BV, 2012)
Authors: Ozcan, Caner; Sen, Baha
In recent years, parallel processing has been widely used in the computer industry, and software developers have to deal with parallel computing platforms and technologies to provide novel and rich experiences. We present a novel algorithm to solve dense linear systems using the Compute Unified Device Architecture (CUDA). High-level linear algebra operations require intensive computation, so in this study a Graphics Processing Unit (GPU) accelerated implementation of the LU linear algebra routine is presented. LU decomposition is a factorization of the form A = LU, where A is a square matrix. The main idea of LU decomposition is to record the steps of Gaussian elimination on A in the places where zeros are produced; L and U are lower and upper triangular matrices, respectively, meaning that L has only zeros above the diagonal and U has only zeros below the diagonal. We worked to increase performance with proper data representation and by reducing row operations on the GPU. Because of the high arithmetic throughput of GPUs, initial experimental results promised a bright future for GPU computing, which has been shown useful for scientific computation: GPUs have high memory bandwidth and more floating-point units than CPUs. We ran our study on different systems with different GPUs and CPUs, and the computations were also evaluated for different linear systems. When we compared the results obtained from both platforms, better performance was obtained with GPU computing; according to the results, GPU computation ran approximately 3 times faster than CPU computation. Our implementation provides a significant performance improvement, so it can easily be used to solve dense linear systems.

Item: Novel approaches for automated epileptic diagnosis using FCBF selection and classification algorithms (Tubitak Scientific & Technological Research Council Turkey, 2013)
Authors: Sen, Baha; Peker, Musa
This paper presents a new application for automated epileptic detection using the fast correlation-based feature (FCBF) selection and classification algorithms. The study consists of three stages: feature extraction from electroencephalography (EEG) signals, feature selection, and classification of these signals. In the feature extraction phase, 16 attribute algorithms are used in 5 categories, and 36 feature parameters are obtained from these algorithms. In the feature selection phase, the FCBF algorithm is chosen to select the set of attributes that best represents the EEG signals. The resulting attributes are used as input parameters for the classification algorithms. In the classification phase, the problem is classified with 6 different classification algorithms, and the results are provided in order to compare calculation times and accuracy rates. The evaluation of the proposed system is conducted using k-fold cross-validation, classification accuracy, sensitivity and specificity values, and a confusion matrix. The proposed approach achieves 100% classification accuracy with the multilayer perceptron neural network and the naive Bayes algorithm. These results show that the proposed method is capable of serving as a new intelligent diagnostic assistance system.

Item: A Novel Classification and Estimation Approach for Detecting Keratoconus Disease with Intelligent Systems (IEEE, 2013)
Authors: Ucar, Murat; Sen, Baha; Cakmak, Hasan Basri
Keratoconus is an eye disease characterized by progressive thinning of the cornea, the transparent front layer of the eye. In other words, the corneal layer is progressively distorted and eventually takes a conical shape instead of its normal dome-like camber; vision deteriorates more and more as the cornea, which should normally be like a sphere, takes the shape of a cone. The aim of this study is to define a new classification method for detecting keratoconus based on statistical analysis and to predict these classes with intelligent systems. For this study, 301 eyes of 159 patients were used, with 394 eyes of 265 refractive surgery candidates as the control group. Factor analysis, one of the multivariate statistical techniques, was mainly used to find more meaningful, easy-to-understand, and independent factors. A new classification method was then defined by applying clustering analysis techniques to these factors, and the resulting classes were finally predicted using artificial neural networks and support vector machines.
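The keratoconus pipeline above (factor analysis, then clustering to define classes, then a neural network to predict them) can be sketched with scikit-learn on synthetic data. The factor and cluster counts, the network size, and the random stand-in measurements are assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 10))          # stand-in for corneal measurements

# Stage 1: factor analysis to obtain a few independent factors.
factors = FactorAnalysis(n_components=3, random_state=5).fit_transform(X)
# Stage 2: clustering on the factors defines the class labels.
labels = KMeans(n_clusters=2, n_init=10, random_state=5).fit_predict(factors)
# Stage 3: a neural network learns to predict the cluster-derived classes.
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=5)
print("CV accuracy:", cross_val_score(mlp, factors, labels, cv=5).mean())
```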
Item: A Parallel Simulated Annealing Algorithm for Weapon-Target Assignment Problem (Science & Information Sai Organization Ltd, 2017)
Authors: Sonuc, Emrullah; Sen, Baha; Bayir, Safak
Weapon-target assignment (WTA) is a combinatorial optimization problem and is known to be NP-complete. The WTA seeks the best assignment of weapons to targets so as to minimize the total expected value of the surviving targets. Exact methods can solve only small-size problems in a reasonable time, and although many heuristic methods have been studied for the WTA in the literature, few parallel methods have been proposed. This paper presents a parallel simulated annealing algorithm (PSA) to solve the WTA. The PSA runs on the GPU using the CUDA platform, and a multi-start technique is used to improve solution quality. Twelve randomly generated problem instances (with up to 200 weapons and 200 targets) are used to test the effectiveness of the PSA. Computational experiments show that the PSA outperforms SA on average and runs up to 250x faster than a single-core CPU.
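A common formulation of the WTA objective (assumed here, since the abstract does not spell it out) minimizes the expected surviving value sum_j V[j] * prod over weapons i assigned to j of (1 - P[i, j]). The sketch below evaluates that objective and runs a small serial simulated annealing over assignments; the paper's GPU multi-start machinery is not reproduced, and the instance data are random.

```python
import numpy as np

rng = np.random.default_rng(6)
W, T = 8, 5
P = rng.uniform(0.2, 0.9, (W, T))   # hypothetical kill probabilities
V = rng.uniform(1.0, 10.0, T)       # hypothetical target values

def surviving_value(assign):
    # Expected total value of surviving targets under assignment a[i] = j.
    surv = np.ones(T)
    for i, j in enumerate(assign):
        surv[j] *= 1.0 - P[i, j]
    return float((V * surv).sum())

a = rng.integers(0, T, W)           # random initial assignment
best, best_c = a.copy(), surviving_value(a)
c, temp = best_c, 10.0
for _ in range(3000):
    i = int(rng.integers(W))        # move one weapon to a random target
    old, a[i] = a[i], int(rng.integers(T))
    new_c = surviving_value(a)
    if new_c <= c or rng.random() < np.exp((c - new_c) / temp):
        c = new_c                   # accept (possibly uphill) move
        if c < best_c:
            best, best_c = a.copy(), c
    else:
        a[i] = old                  # undo the rejected move
    temp *= 0.999
print(best, round(best_c, 3))
```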